So, now let's cover some analytic solutions in government and the public sector. As you can imagine, there's a range of agencies, ministries, and departments at the national and local levels covering things like defense, education, public services, public safety, legal affairs, welfare, housing, commerce, trade and the economy, and transportation. For the most part, these agencies and departments are concerned with improving services and citizen engagement, reducing fraud and ensuring compliance, improving safety, improving intelligence, reducing expenses, and, of course, collecting revenues, as you might imagine.

So, let's start our government examples with a really interesting and unusual use of analytics. It was developed by Google's tech incubator, Jigsaw, and it's meant to dissuade disaffected individuals from being radicalized by ISIS online. It places special ads alongside search results for 1,700 keywords and phrases that Jigsaw determined attract people to ISIS and that prospective recruits commonly search for. The ads redirect to Arabic- and English-language videos on YouTube depicting the actual horrors of ISIS, the realities faced by those who joined, and messages from moderate clerics. In the first two months after it was implemented, over 300,000 people were drawn to the anti-ISIS YouTube channels, and half a million minutes of anti-ISIS video were watched. Click-through rates on the ads are up to four times higher than for typical Google keyword advertising.

In New York, there's a problem with grease. No, not the musical or the movie, but tracking down the grease-dumpers who account for 60 percent of the backups citywide in 7,400 miles of sewer lines. So, the city created a solution that combines data on local restaurants from the city's Business Integrity Commission with licensing data from the Department of Health and Mental Hygiene. They used this data to compare restaurants that did not have a grease carter against geospatial data on where the sewers were located. This enabled the city to track down grease-dumpers with a 95 percent success rate. It eliminated 30 million pounds of debris from the sewers, providing two million extra gallons of sewer capacity annually. It also reduced sewer-backup-related costs for businesses and homeowners.

In Helsinki, the public transportation system wanted to stay competitive with private bus operators. They created a data warehouse to collect and analyze data on the buses in their fleet. They installed sensors on the buses that generate four million data records each day, including speed, braking, acceleration, idling time, engine temperature, fuel consumption, and so forth, on routes carrying 60 million passengers per year. Then they reviewed this data, along with geospatial visualizations, with the drivers themselves to help them figure out how to improve their own performance. The solution reduced fuel consumption by five percent, increased rider satisfaction by seven percent, and reduced the city's carbon footprint. It also helps identify mechanical problems that may be developing on the buses.
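The New York grease example, at its core, is a geospatial match: find restaurants with no licensed grease carter that sit close to where grease-related backups occur. Here's a minimal sketch of that idea in Python. To be clear, the field names, the 250-meter threshold, and the toy records are my own illustrative assumptions, not the city's actual data model.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

# Toy records standing in for Business Integrity Commission / DOHMH data.
restaurants = [
    {"name": "Cafe A", "lat": 40.7301, "lon": -73.9950, "has_grease_carter": False},
    {"name": "Diner B", "lat": 40.7420, "lon": -73.9890, "has_grease_carter": True},
    {"name": "Grill C", "lat": 40.7305, "lon": -73.9947, "has_grease_carter": False},
]
backups = [  # locations of grease-related sewer backups
    {"lat": 40.7303, "lon": -73.9948},
    {"lat": 40.7306, "lon": -73.9946},
]

RADIUS_M = 250  # assumed proximity threshold

suspects = []
for r in restaurants:
    if r["has_grease_carter"]:
        continue  # a licensed carter explains their grease disposal
    nearby = sum(
        1 for b in backups
        if haversine_m(r["lat"], r["lon"], b["lat"], b["lon"]) <= RADIUS_M
    )
    if nearby:
        suspects.append((r["name"], nearby))

# Rank un-cartered restaurants by how many nearby backups they could explain.
for name, count in sorted(suspects, key=lambda t: -t[1]):
    print(f"{name}: {count} grease backups within {RADIUS_M} m")
```

The real solution presumably runs this kind of proximity scoring over the full restaurant and sewer datasets, but the matching logic is the part worth seeing: no grease carter plus repeated backups nearby equals a candidate for inspection.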
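The Helsinki bus example is, at its heart, about rolling raw sensor records up into per-driver metrics that can be reviewed with the drivers themselves. Here's a rough sketch of that roll-up, again with assumed record fields and toy numbers rather than the real fleet schema.

```python
from collections import defaultdict

# Assumed shape of one sensor record; real fleet telemetry would be much richer.
records = [
    {"driver": "D1", "km": 12.0, "fuel_l": 4.1, "idle_min": 6, "harsh_brakes": 3},
    {"driver": "D1", "km": 15.5, "fuel_l": 5.0, "idle_min": 4, "harsh_brakes": 1},
    {"driver": "D2", "km": 13.2, "fuel_l": 5.9, "idle_min": 11, "harsh_brakes": 7},
]

totals = defaultdict(lambda: {"km": 0.0, "fuel_l": 0.0, "idle_min": 0.0, "harsh_brakes": 0})
for rec in records:
    t = totals[rec["driver"]]
    t["km"] += rec["km"]
    t["fuel_l"] += rec["fuel_l"]
    t["idle_min"] += rec["idle_min"]
    t["harsh_brakes"] += rec["harsh_brakes"]

# Per-driver scorecard to review with the drivers themselves.
for driver, t in totals.items():
    print(
        f"{driver}: {100 * t['fuel_l'] / t['km']:.1f} l/100km, "
        f"{t['idle_min'] / t['km']:.2f} idle min/km, "
        f"{t['harsh_brakes']} harsh braking events"
    )
```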
The County of Santa Clara in California wanted to optimize its investments in transportation infrastructure by improving traffic flow. They instrumented 107 traffic intersections, converting video feeds and mapping data into actionable traffic intelligence and integrating the traffic signals. The traffic data is dissected and analyzed by lane, by action, and even down to the individual vehicle. This solution has helped eliminate over 800,000 hours of travel time per year through reduced congestion. For example, they've eliminated 18,000 vehicle stops per day at 17 key expressway intersections. They've also saved a million gallons of fuel, or three million dollars per year, and reduced total emissions by over 100 tons per year, just by collecting data, analyzing it, and optimizing traffic flow.

Some of the more interesting analytic solutions come out of universities themselves. For example, the University of Warwick in the UK created a solution to estimate crowd sizes for crowd control and safety. They use tweets with geolocation and timestamps sent from a designated area, along with the volume of mobile calls, messages, and Internet connections within a particular grid. By analyzing the size of activity spikes versus baseline activity, they can calculate crowd sizes to within five percent at any moment. This gives public safety personnel the ability to facilitate evacuations and control crowds to avoid disasters.

Back in the city of New York, the New York City Fire Department used to analyze fire risks building by building, somewhat haphazardly. Today, they have an algorithm that analyzes 2,400 different factors from over 300,000 commercial and public buildings to determine a risk score that guides inspectors to prioritize certain buildings and their likely fire safety issues. This new solution has achieved a 70 percent success rate in identifying fire hazards in buildings. It can reduce fires and other safety-related events, save on personnel and firefighting resources, and reduce insurance claims.

Now, let's fly overseas to the city of Bolzano in Northern Italy. They have an issue with an aging population that wants to stay in their homes. So, they created a smart-city network of home sensors to monitor temperature, carbon monoxide levels, and water and energy usage, with analytics to establish normal household behavior, identify aberrations, and alert family or neighbors if something has gone awry. As a result, they've been able to lower assistance and care costs by 30 percent, enable more retirees to remain in their homes, and reduce the need for the city to build assisted-living facilities.

Remember when I talked about adapting analytics from one domain to another? This is one of my favorite examples. In the city of Los Angeles, they realized that crimes follow a pattern much like earthquake aftershocks. So, they applied the model used for predicting earthquake aftershocks to historical crime data and other factors. The Los Angeles Police Department is now able to predict twice as many crimes as experienced crime analysts in controlled trials. They've seen a 33 percent reduction in burglaries and a 21 percent reduction in violent crime in the test region of Los Angeles, compared to a slight increase in the rest of the city.

In Pinellas County, Florida, they wanted more efficient spending for at-risk kids. So they integrated child-relevant datasets from different systems such as schools, child welfare, juvenile justice, and impact data. From that, they created a youth-at-risk index, which identifies how children have used Juvenile Welfare Board-funded programs throughout their lifetimes and gauges the impact of those programs. This enables them to innovate on policy and optimize budgets, predicting where to add resources to a particular program and modeling the likely outcomes. They've also created a mobile app for the community and staff to use.
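To make the University of Warwick crowd example a bit more concrete: the core idea is comparing current activity in a grid cell against its normal baseline and scaling the excess into a head count. Here's a toy sketch of that calculation; the calibration factor per tweet or connection is a placeholder I've invented, since the real one would be fitted empirically from observed crowds.

```python
from statistics import mean

# Hourly tweet + mobile-connection counts for one grid cell on ordinary days
# (assumed historical baseline; a real baseline would vary by hour and weekday).
baseline_activity = [310, 290, 305, 320, 298, 315]

# Activity observed in the same cell right now, during an event.
current_activity = 2450

base = mean(baseline_activity)
spike = current_activity - base    # excess messages and connections
PERSONS_PER_UNIT = 2.1             # placeholder calibration factor

estimated_crowd = max(0, spike) * PERSONS_PER_UNIT
print(f"Baseline: {base:.0f}, spike: {spike:.0f}, "
      f"estimated crowd: {estimated_crowd:.0f} people")
```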
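As for the Los Angeles example, the earthquake-aftershock analogy points to self-exciting models, where each crime temporarily raises the likelihood of another one nearby, with the effect decaying over time. The sketch below scores grid cells with a simple exponentially decaying kernel; the actual model behind predictive policing is far more sophisticated, and every parameter here is made up for illustration.

```python
from collections import defaultdict
from math import exp

# Recent burglaries: (grid_cell_id, days_ago). Toy data.
events = [("C12", 1), ("C12", 3), ("C12", 9), ("C07", 2), ("C31", 14)]

BACKGROUND = 0.05   # long-run rate per cell per day (assumed)
ALPHA = 0.30        # boost contributed by each recent crime (assumed)
BETA = 0.25         # how fast that boost decays per day (assumed)

intensity = defaultdict(lambda: BACKGROUND)
for cell, days_ago in events:
    # Each past crime adds a decaying "aftershock" term to its cell.
    intensity[cell] += ALPHA * exp(-BETA * days_ago)

# Cells with the highest predicted intensity are candidates for extra patrols today.
for cell, rate in sorted(intensity.items(), key=lambda kv: -kv[1]):
    print(f"{cell}: predicted intensity {rate:.3f}")
```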
Another great policing example comes to us from the Swedish Police Force. Some years ago, a serial killer was terrorizing the city of Malmö in Sweden. So, the police implemented a system that integrates communication behavior from phone calls, crime statistics, weather, and daily and weekly city events with data from over 500,000 interrogations, other evidence, and background information. In doing so, they were able to reduce nine months of manual analysis to three minutes of automated analytics and help locate the serial killer.

Another one of my favorite examples also came out of a university, this time the University of Rochester. They created a system that is able to determine which restaurants are prone to serving up foodborne illnesses. Researchers analyzed nearly four million tweets from 90,000 users tweeting about, or while at, 23,000 New York City area restaurants. The machine-learning algorithm they created then follows these users for 72 hours, looking for mentions of terms related to nausea. Over four months, the system spotted 480 reports of possible food poisoning, which correlated with poor health department scores for those actual restaurants. This system could give the New York City Department of Health the ability to proactively investigate restaurants.

The US Department of Human Services wanted to improve its ability to detect fraudulent welfare payments and replace its manual data-matching process with a big data platform. It uses an automated data-sharing protocol and maps welfare recipients against unexplained wealth to detect anomalies. Its first year of implementation resulted in $25 million of savings. It's also being used for large-scale statistical risk modeling in the anti-money-laundering and counter-terrorism financing space.

A solution created by the Amsterdam Science Park is helping to save the rhinos. They tag and track the movements of neighboring animals, like zebra, wildebeest, and antelope, as they respond to the presence of human intruders anywhere within the 135-square-mile African reserve. The system monitors and collects information on the animals' locations, movements, directions, and average speed of travel, along with other data. Analytic models help rangers understand the movement patterns of these neighboring animals and alert them to what kind of response might be indicated. As a result, they've seen a 15 percent decline in poaching in the first two years of implementation, following a 9,000 percent increase over the previous seven years.

The city of Beijing created a system for the real-time tracking and prediction of rail transit passenger flow to improve operational service, leveraging 100 million travel records per month. The solution, which helps them build millions of models in just hours, can help enhance the capacity of public transportation. It achieved a 90 percent accuracy rate in predicting flow and volume, and in providing the needed capacity, a week ahead of concerts, sports competitions, and other mass-travel events.
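To give you a feel for the University of Rochester approach, here's a stripped-down sketch: take users who tweeted from a restaurant, then watch their tweets over the following 72 hours for sickness-related language and tally possible cases per restaurant. The keyword list and data shapes are stand-ins I've chosen; the actual system used a trained machine-learning classifier rather than simple keyword matching.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Tweets geotagged at a restaurant: (user, restaurant, timestamp).
visits = [
    ("u1", "Pizza Place", datetime(2024, 3, 1, 19, 0)),
    ("u2", "Pizza Place", datetime(2024, 3, 1, 20, 30)),
    ("u3", "Salad Bar",   datetime(2024, 3, 2, 12, 15)),
]
# Subsequent tweets by those users: (user, timestamp, text).
later_tweets = [
    ("u1", datetime(2024, 3, 2, 8, 0), "ugh, so nauseous this morning"),
    ("u2", datetime(2024, 3, 3, 9, 0), "stomach ache all night, staying home"),
    ("u3", datetime(2024, 3, 6, 10, 0), "feeling sick"),  # outside the 72-hour window
]

SICK_TERMS = {"nauseous", "nausea", "vomit", "stomach ache", "food poisoning"}
WINDOW = timedelta(hours=72)

reports = defaultdict(int)
for user, restaurant, visit_time in visits:
    for u, t, text in later_tweets:
        in_window = visit_time < t <= visit_time + WINDOW
        if u == user and in_window and any(term in text.lower() for term in SICK_TERMS):
            reports[restaurant] += 1
            break  # count at most one possible case per visit

for restaurant, count in reports.items():
    print(f"{restaurant}: {count} possible food-poisoning reports")
```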
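The welfare-payments example is essentially anomaly detection: compare what each recipient appears to spend or own against what their declared income could plausibly support, and flag the biggest gaps for human review. Here's a very simplified sketch with made-up fields and a made-up threshold; a production system would use statistical risk models across many more data sources.

```python
# Matched records after automated data sharing (fields are assumptions).
recipients = [
    {"id": "R-101", "declared_income": 18_000, "observed_spend": 21_000},
    {"id": "R-102", "declared_income": 15_000, "observed_spend": 64_000},
    {"id": "R-103", "declared_income": 22_000, "observed_spend": 19_500},
]

FLAG_RATIO = 2.0  # observed spend more than 2x declared income (assumed threshold)

for r in recipients:
    ratio = r["observed_spend"] / max(r["declared_income"], 1)
    if ratio > FLAG_RATIO:
        # Unexplained wealth: route to an investigator, never auto-decide.
        print(f"{r['id']}: spend is {ratio:.1f}x declared income -> review")
```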
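Finally, the Beijing passenger-flow example is at heart a forecasting problem: learn what a station normally looks like at a given weekday and hour, then adjust upward when a concert or a match is on the calendar. The sketch below uses a naive same-slot average plus an event multiplier and is only meant to show the shape of the problem; the real system builds millions of far richer models, and every number here is invented.

```python
from collections import defaultdict
from statistics import mean

# Historical entries per (station, weekday, hour) from travel records (toy data).
history = [
    ("Stadium", 5, 19, 4200), ("Stadium", 5, 19, 4500), ("Stadium", 5, 19, 4100),
    ("Stadium", 5, 20, 3900), ("Downtown", 5, 19, 8800), ("Downtown", 5, 19, 9100),
]

by_slot = defaultdict(list)
for station, weekday, hour, riders in history:
    by_slot[(station, weekday, hour)].append(riders)

EVENT_MULTIPLIER = 2.5  # assumed uplift for a scheduled concert or match

def forecast(station, weekday, hour, big_event=False):
    """Naive forecast: same-slot historical mean, scaled up if an event is planned."""
    base = mean(by_slot[(station, weekday, hour)])
    return base * (EVENT_MULTIPLIER if big_event else 1.0)

# One week ahead: Saturday 19:00 at the stadium station, concert scheduled.
print(f"Expected riders: {forecast('Stadium', 5, 19, big_event=True):.0f}")
```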