Tag Archives: Data

Tackling Big Data on Police Use of Force #ABQ #NM #publicsafety #data

According to a review recently released by the Police Executive Research Forum (PERF), the Albuquerque Police Department (APD) was faced with a riddle. In recent years, violent crime and assaults on officers have declined, training has been adapted to focus on de-escalating conflicts and avoiding physical confrontations, the department has procured less-lethal weaponry and trained officers to use it, a non-disciplinary Early Warning System was put in place to identify officers showing warning signs of excessive use of force based on recent incidents, and a Crisis Intervention Team was created to help deal with mentally ill persons, who are involved in 65% of officer-involved shootings nationwide. Still, Albuquerque’s rate of police-suspect encounters involving use of force remained unusually high relative to comparable cities.

Eager to find out what was going wrong, APD collected all the data it could. Already, any police use of force, from grabbing a suspect to a fatal gun battle, was reported and reviewed by numerous committees through the police hierarchy, the Internal Affairs Division, and civilian government. As anyone involved with law enforcement will tell you, reporting and record-keeping is extensive. On top of that, APD monitored its officers with audio recorders on the gun belts and video recorders on the shirts of every patrol officer, as well as dashboard cameras on squad cars involved in higher-risk operations. It even instituted a database that signaled when an officer was involved too often in activities indicating risk. APD pored over this information to revise training and procedure, to little avail: use of force rates remained steady.

The Albuquerque Police Department is not the only one confused over use of force. Nationwide, experts agree that despite extensive writing on the subject and software to track incidents, use of force, and especially wrongful use of force, is poorly understood due to gaps in data and inconsistencies in reporting between departments. There are many “snapshots” of use of force from select departments, studies, and reports, some contradictory, but no big picture. Even Albuquerque, where the police department follows the most progressive practices on monitoring violence, suffers from data gaps and faulty reporting procedures. PERF accordingly recommended better checking and reporting of use of force, documentation of the Crisis Intervention Team’s responses and outcomes, more follow-up reporting, and tighter integration of the various monitoring and reporting mechanisms already in place.

One way to achieve these goals is to bring all the data on police use of force together, rather than just storing a count of incidents in a database. Ideally, a use of force database would include data from police departments across the country, like the FBI’s national fingerprint database IAFIS or its Combined DNA Index System (CODIS), allowing towns, cities, counties, and states to learn from each other, share best practices, and improve reporting and monitoring nationwide. Even for one city like Albuquerque, however, this would be a Big Data challenge.

Consider all the information that could be pooled to provide a more complete view of use of force incidents. On top of a description of the actual incident, it would be beneficial to have dispatcher data showing what the officer knew going in and how he or she was dispatched. Of course, the police report would be included, as well as any other reports and findings around the use of force, which for a shooting, fatal or not, would include the findings of a Multi-Jurisdictional Investigative Team, a Homicide Unit investigation, an Internal Affairs investigation, interviews with psychologists, and reviews by the Grand Jury and the city’s Independent Review Officer. To capture the citizen’s and suspect’s perspective, complaints would be included in the database, as well as any relevant newspaper articles. To better understand the officers, their records, commendations, citations, and training would be included. Aside from this mass of text, a whole host of audio-visual recordings would objectively chronicle the incident, including the audio from the officer’s gun belt, the video from his or her uniform, and any relevant footage from a dash-cam or surveillance camera in the area.

Storing and analyzing this information would be no small feat. The data is complex and unstructured, stored in formats ranging from free text to indexed reports to audio and video. If other police departments were included, still more formats would arise. The data is also massive. To give an idea of the scale, APD alone produces about 9,000 police reports a year for 45,000 calls for service and makes about 1,200 felony arrests. Nationwide, fewer than 2% of interactions result in use of force, but assuming 2% of the 45,000 calls for service yields roughly 900 use of force incidents every year. Though daunting, this would be a challenge worth taking on. In fields from business to medicine and intelligence, better and more holistic data, when analyzed, has revealed conclusions contrary to anecdotal evidence and practitioner impressions. Even in law enforcement, extensive data analysis of old problems has yielded surprising results, such as correlations between moon phase and crime, later attributed to how bright the night was. Bringing together traditional reports with evaluations and reviews adds depth and significance, including non-police observations and commentary adds perspective, and audio and video provide a means of verifying the information.
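To make the back-of-the-envelope arithmetic concrete, here is a minimal sizing sketch in Python. The call volume and use of force rate are the figures cited above; the per-incident media sizes are purely illustrative assumptions, not measured values:

```python
# Back-of-the-envelope sizing for one department's use of force data.
CALLS_PER_YEAR = 45_000          # APD calls for service (from the report)
USE_OF_FORCE_RATE = 0.02         # national estimate: <2% of interactions

incidents = CALLS_PER_YEAR * USE_OF_FORCE_RATE      # ~900 incidents/year

# Assumed per-incident payload (illustrative guesses, not measured values):
MB = 1024 ** 2
text_bytes = 5 * MB              # reports, reviews, complaints, articles
audio_bytes = 50 * MB            # gun belt audio
video_bytes = 500 * MB           # uniform cam, dash cam, surveillance footage

total = incidents * (text_bytes + audio_bytes + video_bytes)
print(f"{incidents:.0f} incidents/year, about {total / 1024**4:.2f} TB/year")
```

Even with these modest guesses, one mid-sized department generates roughly half a terabyte a year, so a multi-department repository quickly climbs toward the petabyte range.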

Storing and analyzing this Big Data cheaply and effectively is no longer out of reach, thanks to advances in data science driven by business intelligence. Apache Hadoop, for example, can store and process massive amounts of unstructured data in its original form for pennies on the dollar: it is open source, requiring no licensing fees, and runs on clusters of commodity hardware instead of supercomputers. It also provides a platform for complex and evolving analysis with a variety of open source software.
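As a rough illustration of what “storing data in its original form” can look like in practice, the sketch below shells out to the standard `hadoop fs` commands to copy a directory tree of incident files, whatever their format, into HDFS. The directory layout and paths are assumptions for the example, not APD’s actual setup:

```python
import subprocess
from pathlib import Path

# Hypothetical local evidence tree: one directory per incident, holding
# reports (.txt/.pdf), gun belt audio (.wav), and camera video (.mp4).
LOCAL_ROOT = Path("/data/incidents")
HDFS_ROOT = "/police/use_of_force"   # hypothetical HDFS destination

def ingest(incident_dir: Path) -> None:
    """Copy one incident's files into HDFS, keeping the original formats."""
    dest = f"{HDFS_ROOT}/{incident_dir.name}"
    subprocess.run(["hadoop", "fs", "-mkdir", "-p", dest], check=True)
    subprocess.run(["hadoop", "fs", "-put", "-f", str(incident_dir), dest],
                   check=True)

for incident in sorted(LOCAL_ROOT.iterdir()):
    if incident.is_dir():
        ingest(incident)
```

Because HDFS is format-agnostic, the same cluster can hold text reports, audio, and video side by side, leaving format-specific parsing to the analysis jobs.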

This analysis can then automatically sift through the terabytes to petabytes of data to find trends and warning signs with much more depth than the current system, which only looks at counts of certain incidents. Department-wide trends can be generated, and when an incident or officer is under review, relevant data can be pulled from the system. Already, a host of analytics exist for text, audio, and video, some designed specifically for Hadoop. IBM’s Watson, for example, used textual analysis to defeat human contestants on Jeopardy!, generating questions for the answers written on the game board in real time by poring through a virtual library. Other software can analyze audio recordings for emotional state and signs of aggression, and Video Content Analysis can flag suspected use of force and erratic or suspicious behavior. Products like piXserve can automatically index and search video. Machine learning tools like Mahout, developed to work with Hadoop, can pick up trends that analysts weren’t even looking for. And full-stack distributions such as the Cloudera Distribution including Apache Hadoop (CDH) bundle these Hadoop-related components into a high-performing, high-availability platform with both analytical and management tools.
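For a flavor of the kind of analysis involved, here is a toy MapReduce-style job that counts use-of-force incidents per officer per month and flags unusually busy months. It runs locally as plain Python; with Hadoop Streaming, the same map and reduce logic would run across a cluster. The input schema and threshold are hypothetical:

```python
from itertools import groupby

def map_record(line):
    # Emit (officer:YYYY-MM, 1) for every record where force was used.
    # Hypothetical schema: officer_id <TAB> ISO date <TAB> force_level
    officer, date, force_level = line.split("\t")
    if force_level != "none":
        yield f"{officer}:{date[:7]}", 1

def reduce_counts(pairs, threshold=3):
    # Sum per key; flag officer-months at or above the threshold.
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        total = sum(v for _, v in group)
        yield key, total, "FLAG" if total >= threshold else "ok"

sample = [
    "A123\t2011-05-02\tgrab",
    "A123\t2011-05-19\ttaser",
    "A123\t2011-05-27\tgrab",
    "B456\t2011-05-11\tnone",
]

pairs = [kv for line in sample for kv in map_record(line)]
for key, total, flag in reduce_counts(pairs):
    print(key, total, flag, sep="\t")
```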

Altogether, a Big Data approach to police records would mean more than closing gaps and cross-checking accounts. Even the PERF report, which combined interviews with extensive statistical analysis, wasn’t able to pinpoint Albuquerque’s problem, instead suggesting 40 small changes that may together produce positive trends. Most are slight improvements to procedure, some as minor as proposed name changes. Overall, APD seemed to have done everything right, leaving even the experts with what the intelligence community after 9/11 termed a “failure of imagination.” When done well, however, data analysis can find trends that analysts would never think to search for. For example, there may be a telling correlation between an officer’s geographic location, the aggression in his or her voice, and the outcome of the interaction. As a result, Big Data storage and analysis tools like Apache Hadoop could revolutionize not only police record keeping but, through the findings, the practice of policing itself.
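As a sketch of how such a correlation might be explored once the data sits in one place, the pandas snippet below checks whether a voice-aggression score from audio analysis tracks use-of-force outcomes, overall and by precinct. All column names and values are invented for the example:

```python
# Exploratory look at the hypothetical location / vocal-aggression /
# outcome relationship described above. Data is illustrative only.
import pandas as pd

df = pd.DataFrame({
    "precinct":   ["NE", "NE", "SW", "SW", "SE", "SE"],
    "aggression": [0.9, 0.7, 0.2, 0.3, 0.5, 0.6],   # from audio analysis
    "force_used": [1, 1, 0, 0, 0, 1],               # 1 = force reported
})

# Overall correlation between vocal aggression and use of force...
print(df["aggression"].corr(df["force_used"]))

# ...and whether the relationship differs by geography.
print(df.groupby("precinct")[["aggression", "force_used"]]
        .mean()
        .sort_values("force_used", ascending=False))
```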

Google+ for Crowdsourcing Crisis Information, Crisis Mapping and Disaster Response (HT @peakwx)

Facebook is increasingly used to crowdsource crisis information and response, as is Twitter. So is it just a matter of time until we see similar use cases with Google+? Another question I have is whether such use cases will simply reflect more of the same or whether we’ll see new, unexpected applications and dynamics. Of course, it may be premature to entertain the role that Google+ might play in disaster response just days after its private beta launch, but the company seems fully committed to making this new venture succeed. Entertaining how Google+ (G+) might be used as a humanitarian technology thus seems worthwhile.

The fact that G+ is open and searchable is probably one of the starkest differences from the walled garden that is Facebook; that, and Google’s Data Liberation policy. This will make activity on G+ much easier to find—Google is the King of Search, after all. This openness makes serendipity and synergies more likely.

The much talked about “Circles” feature is also very appealing for the kind of organic and collaborative crowdsourcing work that we see emerging following a crisis. Think about these “Circles” not only as networks but also as “honeycombs” for “flash” projects—i.e., short-term and temporary—very much along the lines that Skype is used for live collaborative crisis mapping operations.

Google+’s new Hangout feature could also be used instead of Skype chat and video, with the advantage of multi-person video-conferencing. With a little more work, the Sparks feature could facilitate media monitoring—an important component of live crisis mapping. And then there’s Google+ mobile, which is accessible on most phones with a browser and already includes a “check-in” feature as well as geo-referenced status updates. The native Android app is already available, and the iPhone app is coming soon.

Clicking on my status update above produces the Google Maps page below. What’s particularly telling about this is how “underwhelming” the use of Google Maps currently is within G+. There’s no doubt this will change dramatically as G+ evolves. The Google+ team has noted that they already have dozens of new features ready to be rolled out in the coming months. So expect G+ to make full use of Google’s formidable presence on the Geo Web—think MapMaker+ and Earth Engine+. This could be a big plus for live crowdsourced crisis mapping, especially of the multimedia kind.

One stark difference from Facebook’s status updates and check-ins is that G+ allows you to decide which Circles (or networks of contacts) to share your updates and check-ins with. This is an important difference that could allow for more efficient information sharing in near real-time. You could set up your Circles as different teams, perhaps even along UN Cluster lines.

As the G+ mobile website reveals, the team will also be integrating SMS, which is definitely key for crisis response. I imagine there will also be a way to connect your Twitter feed with Google+ in the near future. This will make G+ even more compelling as a mobile humanitarian technology platform. In addition, I expect there are also plans to integrate Google News, Google Reader, Google Groups, Google Docs and Google Translate with G+. GMail, YouTube and Picasa are already integrated.

One feature that will be important for humanitarian applications is offline functionality. Google Reader and GMail already have this feature (Google Gears), which I imagine could be added to G+’s Stream and perhaps eventually to Google Maps. In addition, if Google can provide customizable uses of G+, then this could also make the new platform more compelling for humanitarian organizations, e.g., if OCHA could have their own G+ (“iG+”) by customizing and branding their G+ interface, much like the flexibility afforded by the Ning platform. One first step in that direction might be to offer a range of “themes” for G+, just like Google does with GMail.

Finally, the ability to develop third party apps for G+ could be a big win. Think of a G+ store (in contrast to an App Store). I’d love to see a G+ app for Ushahidi and OSM, for example.

If successful, G+ could be the best example of “What Technology Wants” to date. G+ is convergence technology par excellence. It is a hub that connects many of Google’s excellent products and from the looks of it, the G+ team is just getting warmed up with the converging.

I’d love to hear from others who are also brainstorming about possible applications of Google+ in the humanitarian space. Am I off on any of the ideas above? What am I missing? Maybe we could set up a Google+ 4 Disaster Response Circle and get on Hangout to brainstorm together?

2011 Best Practices for Government Libraries Now Available – Government Info Pro

2011 Best Practices for Government Libraries Now Available

The 2011 Best Practices for Government Libraries: e-Initiatives and e-Efforts: Expanding Our Horizons is now available: 2011 Best Practices for Government Libraries in PDF. Best Practices for Government Libraries for 2011 and prior years are all available from the right sidebar here on the Government Info Pro.

Best Practices for Government Libraries 2011

Best Practices is a collaborative document that is put out annually on a specific topic of interest to government libraries and includes content submitted by government librarians and community leaders with an interest in government libraries. The 2011 edition includes over 70 articles and other submissions provided by more than 60 contributors including librarians in government agencies, courts, and the military, as well as from professional association leaders, LexisNexis Consultants, and more. As the editor of Best Practices for Government Libraries, I want to thank the contributors for sharing their knowledge, experience, and thoughtful perspectives in this year’s Best Practices. If you did not write for this year’s Best Practices, I invite you to submit a guest post for the Government Info Pro.

The 2011 Best Practices for Government Libraries:  e-Initiatives and e-Efforts:  Expanding Our Horizons is broken into six sections:

  • EMBRACING NEW AVENUES OF COMMUNICATION
  • ADAPTING TO NEW AND EVOLVING TECHNOLOGIES
  • ALTERING OUR PLACES AND SPACES
  • TACKLING CHANGING EXPECTATIONS, RESOURCES, AND JOB DESCRIPTIONS
  • PRESERVING WHAT WE HAVE AND PREPARING FOR THE FUTURE
  • EXPANDING HORIZONS

Here is a sampling of the articles in each section:

EMBRACING NEW AVENUES OF COMMUNICATION

  • Blogging at the Largest Law Library in the World
    Christine Sellers, Legal Reference Specialist, and Andrew Weber, Legislative Information Systems Manager, Law Library of Congress
  • “Friended” by the Government? A Look at How Social Networking Tools Are Giving Americans Greater Access to Their Government
    Kate Follen, MLS, President, Monroe Information Services
  • Podcasts Get Information Junkies their Fix
    Chris Vestal, Supervisory Patent Researcher with ASRC Management Services, U.S. Patent and Trademark Office and DC/SLA‘s 2011 Communication Secretary
  • Getting the Most from Social Media from the Least Investment of Time and Energy
    Tammy Garrison, MLIS, Digitization Librarian at the Combined Arms Research Library at Fort Leavenworth, KS
  • Thinking Outside the Email Box: A New E-Newsletter for the Justice Libraries
    Kate Lanahan, Law Librarian, and Jennifer L. McMahan, Supervisory Librarian, U.S. Department of Justice
  • Bill’s Bulletin: Librarians and Court Staff Working Together to Develop an E-Resource
    Barbara Fritschel, U. S. Courts Library, Milwaukee, WI
  • Proletariat’s Speech: Foreign Language Learning with a Common Touch
    Janice P. Fridie, Law Librarian, U.S. Department of Justice
  • Social Media Comes Together with Storify
    Chris Zammarelli, Contract Cataloger on behalf of ATSG, LLC for the U.S. Department of State Bureau of International Information Programs, Office of Information Resources

ADAPTING TO NEW AND EVOLVING TECHNOLOGIES

  • EBooks in Special Libraries: Final Report of the Federal Reserve System Libraries Work Group on EBooks
    Luke Mueller, Technical Librarian, Federal Reserve Bank of Philadelphia
  • Kindle Lending Programs in Libraries
    Montrese Hamilton, Librarian, Society for Human Resource Management
  • Intranet Case Study: Government Agency
    Lorette S.J. Weldon, MLS, BSIFSM, BA
  • Putting the E in Library
    David E. McBee, Federal Government Librarian, ww.librarybuzz.blogspot.com
  • Web E-Accessibility to Reach Full E-Audience: “Expanding Our Horizon” to Better Honor Diversity
    Ken Wheaton, Web Services Librarian, Alaska State Court System Law Library

ALTERING OUR PLACES AND SPACES

  • Embedded Librarianship and E-Initiatives: The Dynamic Duo
    Rachel Kingcade, Chief Reference & CSC Direct Support Librarian, USMC Research
  • Utilizing Electronic Databases During a Library Relocation
    George Franchois, Director, U.S. Dept. of the Interior Library
  • E-Reference at the Library of Congress
    Amber Paranick and Megan Halsband, Reference Librarians, Newspaper & Current Periodical Reading Room, Serial & Government Publications Division, Library of Congress
  • Best Practices: Telework
    Robert Farina, MSLIS, Entrepreneur, Minor Potentate of Logogrammatic Research & Analysis, Data Wrangler, etc.
  • Best Practices for Virtual Reference
    Susan Ujka Larson, MLIS
  • To Build a Virtual Embedded Information Role, Start at the Top
    Mary Talley, Owner, TalleyPartners, 2011 DC/SLA President

TACKLING CHANGING EXPECTATIONS, RESOURCES, AND JOB DESCRIPTIONS

  • Accidental Advisors: There’s GOT to Be a Better Way!
    Compiled by Nancy Faget and Jennifer McMahan (Eugenia Beh, Blane Dessy, Aimee Babcock-Ellis, Marianne Giltrud, Jessica Hernandez, Rich Louis, Virginia Sanchez)
  • I Need a Library Job: Finding and Filling a Need on the Fly
    Naomi House, Reference Librarian, Census Library
  • Rebranding the Library
    Julie Jones, Hartford Branch Librarian, U.S. Courts, Second Circuit Library
  • NIH Handheld User Group: Library-IT Collaboration
    James King, Information Architect, NIH Library
  • Cats and Dogs – Living Together: Leveraging IT Resources for Library Use
    Sarah Mauldin, Head Librarian, Smith, Gambrell & Russell, LLP, Atlanta, GA
  • Broadband Plan and the Provision of Public Libraries
    Christian Jiménez Tomás, Information Specialist, The World Bank Law Resource
  • E-Gov Sites to Go Dark?
    Kim Schultz, Outreach Specialist at the NASA Center for AeroSpace Information, operated by Chugach Federal Solutions, Inc.
  • E-Gov on the Web: A Brief Summary of Electronic Access Through On-Line Resources
    Jennifer Klang, Head of Reference Services, Department of the Interior Library
  • Public Records Resources Online: How to Find Everything There Is to Know About “Mr./Ms. X”
    Jennifer L. McMahan, Supervisory Librarian, U.S. Department of Justice
  • The Challenge of E-Legislative History for the “51st State”
    Lisa Kosow, Law Librarian, U.S. Attorney‘s Office for the District of Columbia
  • E-Gov Resources on Native Americans and Tribal Issues
    Kathy Kelly, MSLS, C.A.
  • LexisNexis 2010 International Workplace Productivity Survey: Executive Summary of Results for Legal Professionals

PRESERVING WHAT WE HAVE AND PREPARING FOR THE FUTURE

  • Federal Libraries on the E-Horizon
    Blane K. Dessy, Executive Director, FLICC/FEDLINK, Library of Congress
  • Research Metrics: Measuring the Impact of Research
    James King, Information Architect, NIH Library
  • When I Walk Across My Library I Think…
    Edwin B. Burgess, Director, Combined Arms Research Library
  • E-Initiative Liberia: Creating a Legislative Library in the Rubble of War
    Mary Nell Bryant, M.A., M.L.S., U.S. Foreign Service Information Officer, retired
  • JustSearch at the Department of Justice
    Lila Faulkner, Diane L. Smith, and Jane Sanchez, Library Staff, U.S. Department of Justice Library Staff
  • Real Libraries, Virtual Fundraising
    Biblio Latte, Volunteer Reference Librarian, Community Virtual Library
  • Accessible Libraries: Ensuring All May Read
    Jane Caulton, Head, Publications and Media Section, NLS, Library of Congress
  • A Model Lessons Learned System – The US Army
    Nancy M. Dixon, Principal Researcher, Common Knowledge Associates

EXPANDING HORIZONS

  • Ten Scary Issues: Future Directions for Military Libraries
    Edwin B. Burgess, Director, Combined Arms Research Library
  • Future Ready 365
    Cindy Romaine, SLA President 2011
  • Expanding Horizons with E-Learning: VA Librarians Develop Online Tutorial for EBN Training
    Priscilla L. Stephenson, MSLS, MSEd, Philadelphia VA Medical Center, Philadelphia, PA; and Teresa R. Coady, MLS, VA Central Iowa Healthcare System, Des Moines, IA
  • Library Connect Newsletter: Information Industry Explorations by and for Librarians
    Colleen DeLory, Editor, Library Connect Publications, Elsevier
  • Building a Framework to Embrace the New and Expand Your Horizons
    Bruce Rosenstein, Author, Living in More Than One World: How Peter Drucker‘s Wisdom Can Inspire and Transform Your Life
  • All About E
    Peggy Garvin, Founder & Principal, GarvinInformationConsulting.com

Want more Best Practices? View the 2010 Best Practices: The New Face of Value in PDF.

Doing Good With Data – Data Without Borders | jake.porway

Update here: http://jakeporway.com/2011/06/data-without-borders-huh-people-like-this-idea/

As we all know, the world is inundated with data about practically everything we do, from where we are to who we know to what we eat, and it’s an extremely exciting time to be working in a field trying to make sense of all of it. However, as I and others have pointed out, there’s a lot of effort in our discipline put toward what I feel are sort of “bourgeois” applications of data science, such as using complex machine learning algorithms and rich datasets not to enhance communication or improve the government, but instead to let people know that there’s a 5% deal on an iPad within a 1 mile radius of where they are. In my opinion, these applications bring vanishingly small incremental improvements to lives that are arguably already pretty awesome.

On the other hand, there are lots of NGOs and non-profits out there doing wonderful things for the world, from rehabilitating criminals, to battling hunger, to providing clean drinking water. However, they’re increasingly finding themselves with more and more data about their practices, their clients, and their missions that they don’t have the resources or budgets to analyze. At the same time, the data/dev communities love hacking together weekend projects where we play with new datasets or build helpful scripts, but these usually just culminate in a blog post or some Twitter buzz. Wouldn’t it be rad if we could get these two sides together?

Read the rest here: jakeporway.com

AfDB and AidData launch interactive aid map

Published: 17/06/2011

The African Development Bank (AfDB) has launched an interactive map detailing the exact locations of its projects in several countries. Development Loop, a partnership between the AfDB and AidData, maps projects in Cameroon, Morocco and Tanzania. These geocoded AfDB activities, a subset of nearly 2,040 activities from the Bank, cover school construction, health clinics and roads, amongst others.
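For a sense of what rendering such geocoded activities involves, here is a minimal sketch using the folium mapping library; the project names and coordinates are made up for illustration and are not AidData’s actual records:

```python
# Plot a few hypothetical geocoded aid projects on an interactive map.
# Requires: pip install folium
import folium

projects = [
    {"name": "Primary school construction", "country": "Cameroon",
     "lat": 3.85, "lon": 11.50},
    {"name": "Rural health clinic", "country": "Tanzania",
     "lat": -6.80, "lon": 39.28},
    {"name": "Road rehabilitation", "country": "Morocco",
     "lat": 33.57, "lon": -7.59},
]

m = folium.Map(location=[5, 15], zoom_start=3)   # centered on Africa
for p in projects:
    folium.Marker([p["lat"], p["lon"]],
                  popup=f'{p["name"]} ({p["country"]})').add_to(m)
m.save("aid_projects.html")                      # open in any browser
```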

The value of this new project is that it makes the Bank’s work more transparent, enabling people to see where its projects are based and decreasing the chance of overlap.

Simon Mizrahi, manager of AfDB’s Results and Quality Assurance Department, says:

The map makes it easy to see where the Bank is working, not just in which countries, but in which regions or towns. Through this partnership, we’ve been able to efficiently translate information which existed primarily in long documents into a simple visual tool for decision makers and the public at large to quickly understand what we do, and where. This is a critical step toward being able to ask the right questions about whether aid is going to the right places and what impact it has.

It will also help the AfDB to evaluate their current programmes and plan for future ones. What’s more, it enables citizens and other aid agencies to better understand where aid is going, whilst also allowing feedback from those affected by projects.

For more information about the maps and to find out how they were created visit Development Gateway’s website.

Visualizing Data with OnTheMap for Emergency Management – Random Samplings #EMCampNM

The Census Bureau released OnTheMap for Emergency Management Version 2.0 today, the beginning of the 2011 hurricane season. Version 2.0 adds floods and wildfires to the hurricane coverage released last year.

OnTheMap for Emergency Management is a public data tool that provides unique detail on the workforce for U.S. areas affected by hurricanes, floods, and wildfires in real time. The web-based tool provides an intuitive interface for viewing the location and extent of current and forecasted emergency events on a map, and allows users to easily retrieve detailed reports containing labor market characteristics for these areas. The reports provide the number and location of jobs, industry type, and worker age and earnings. Worker race, ethnicity, and educational attainment levels are in beta release at this time.

To provide users with the latest information available, OnTheMap for Emergency Management automatically incorporates real-time data updates from the National Weather Service, the Departments of the Interior and Agriculture, and other agencies for hurricanes, floods, and wildfires.
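As a minimal sketch of what such automated ingestion might look like, assuming the event data is exposed as an RSS feed, the snippet below polls a placeholder URL (not the actual endpoint OnTheMap uses) and lists the latest advisories:

```python
# Poll an RSS advisory feed the way an automated pipeline might.
# The feed URL is a placeholder, not a real government endpoint.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/storm-advisories.xml"   # hypothetical

with urllib.request.urlopen(FEED_URL, timeout=30) as resp:
    root = ET.fromstring(resp.read())

# Standard RSS 2.0 layout: channel/item/{title, pubDate}
for item in root.iterfind("./channel/item"):
    title = item.findtext("title", default="")
    updated = item.findtext("pubDate", default="")
    print(f"{updated}  {title}")
```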

Version 2.0 includes the following new features and enhancements:

  • Real-Time Updates for Hurricanes, Floods, and Wildfires
  • Expanded Hurricane Detail (Forecast Area, Current Wind Radii, Wind History)
  • Expanded Reports with New Beta Data on Demographic Characteristics (Race, Ethnicity, Educational Attainment)
  • Improved User Interface with New Map, Navigation, and Search Tools
  • Animation & Timeline for Viewing Daily Event Histories

Version 2.0 also includes updated help documentation, including a getting started guide, system requirements, and frequently asked questions. Data for the District of Columbia, Massachusetts, and New Hampshire are not available at this time.

OnTheMap for Emergency Management is supported by the state Labor Market Information agencies under the Local Employment Dynamics (LED) partnership with the Census Bureau.

OnTheMap for Emergency Management Version 2.0 can be accessed by selecting “Local Employment Dynamics” at http://www.census.gov and then OnTheMap for Emergency Management under Quick Links, or directly at http://lehdmap.did.census.gov/em.html.

What do you think of the new version of OnTheMap? Post your comments here.