In 1989, when the Internet was still cloaked in the relative obscurity of ARPANET, Richard Saul Wurman (founder of the TED talks) published a book that became my bible, Information Anxiety. I was an information broker at the time, searching the cavernous online databases of Dialog and LexisNexis to provide businesses with competitive intelligence. The tagline for my company was “Information you need, when you need it, and not a minute beforehand.”

Wurman’s book deftly captured the tension wrought by the exponential increase of information in the late 20th century, underscoring my clients’ fears of falling behind the information curve. He stated: “Information anxiety is produced by the ever-widening gap between what we understand and what we think we should understand. It is the black hole between data and knowledge, and it happens when information doesn't tell us what we want or need to know.”

Eric Sweden, Program Director of Enterprise Architecture & Governance for NASCIO, shared similar thoughts in his article Is Big Data a Big Deal for State Governments, his contribution to NASCIO’s excellent series on analytics: “Data by itself is of little value until it is turned into information, generates knowledge, and enables wisdom in making decisions which in turn results in outcomes. In the case of state government, most important are citizen outcomes.”

From Sweden’s vantage, the drive for analytics is what is spurring the passion around big data: “MIT and IBM conducted surveys in 2010 and 2011 with the objective of gaining perspectives on the burgeoning demand for analytics. There was a 50% increase in the number of respondents to the 2011 survey (4,500 executives and analysts).
Moreover, there was a 57% jump in the number of respondents that believe analytics provide ‘substantial’ or ‘significant’ contribution to effectiveness.”

A 2012–2017 market sizing report from Wikibon concedes that what it had taken for hype trending downward is no such thing: “Factory revenue generated by the sale of Big Data-related hardware, software and services took a major step forward in 2012, growing by 59% over 2011. The total Big Data market reached $11.4 billion in 2012, ahead of Wikibon’s 2011 forecast. The Big Data market is projected to reach $18.1 billion in 2013, an annual growth of 61%. This puts it on pace to exceed $47 billion by 2017. That translates to a 31% compound annual growth rate over the five-year period 2012-2017.”

The analytics bandwagon will have a big impact on government procurement trends. Based on Wikibon’s findings and the McKinsey Global Institute’s 2011 report, demand will be created for people and skills, technology, training and services. In 2011 McKinsey warned of a shortage of 140,000 to 190,000 people with sophisticated analytical skills, as well as a shortage of 1.5 million managers and analysts able to analyze data and make decisions based on analytics. Procurement requests for these talents are appearing on Onvia with greater frequency.

In a recent bid for a “Data Analytics and Visualization Service,” NASA’s Ames Research Center sought to: Procure text-data analytics and visualization services to better understand and visualize the trends and discussions taking place in industry, academics, and government in areas of commercial space technologies and industries that are of interest to NASA. The Data Scientist shall run queries on their company software and help analyze the results in-person or over the phone and provide visualization of the analysis.
The software that the Data Scientist is running should be able to: The data scientist should also have access to personnel within their company who are experts in data science, machine learning, algorithm design, computer science, applied physics, and many other technical fields.

Onvia sits on a treasure trove of 12 years of government procurement “big data.” Our next challenge will be to open an analytical window into this vast resource.
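As an aside, Wikibon’s quoted 31% compound annual growth rate can be sanity-checked against the dollar figures in the same quote ($11.4 billion in 2012, roughly $47 billion by 2017). A minimal sketch in Python, using only those quoted figures; the small discrepancy against 31% reflects rounding in the $47 billion endpoint:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate: the constant yearly rate that
    grows start_value into end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Figures from the Wikibon quote: $11.4B in 2012, ~$47B by 2017 (5 years).
rate = cagr(11.4, 47.0, 5)
print(f"{rate:.1%}")  # in the low 30s of percent, near the quoted 31%
```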