Today, we were invited to Google Cloud HQ in Sunnyvale to hear more about the new AI for Social Good Program launched by Google and presented by Jeff Dean, Google Senior Fellow and SVP of Google AI.
These days, AI is one of the hottest topics in tech alongside blockchain, and a major source of anxiety: it might destroy millions of jobs without any certainty of replacing them with more creative, less painful wage-generating occupations. Another intensely debated dark side of AI is its potential to harm humanity, from AI-driven weapons (a line of work Google recently stepped away from), to Microsoft’s Twitter bot Tay turning misogynistic and racist, to Facebook’s misuse of user data and plans to sell predicted behaviors to advertisers.
To start with some positive AI vibes, Jeff Dean showcased the use of AI in detecting exoplanets across 40 billion data points from 200,000 stars collected by the Kepler space telescope over four years (2009–2013). Google’s machine-learning technology helped make Kepler even better, helping NASA discover two new exoplanets in 2017 – Kepler-90i and Kepler-80g – on top of the 3,000+ the spacecraft had already found. Using Google’s technology, researchers trained a deep neural network to recognize the weak light signals typical of exoplanets, which conventional software – only able to decipher stronger signals – had missed.
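Kepler’s actual pipeline relied on a trained deep neural network, but the core difficulty – a transit dip far weaker than the noise in any single measurement – can be illustrated with a short, self-contained sketch. All the numbers below are invented for the demonstration: folding a simulated light curve at the orbital period stacks the dips on top of each other and lifts them above the noise floor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated light curve: 40,000 brightness samples with a weak periodic
# transit dip (0.1% deep, period 200 samples) buried in 0.2% noise.
n, period, depth = 40_000, 200, 0.001
flux = 1.0 + rng.normal(0.0, 0.002, n)
in_transit = (np.arange(n) % period) < 5          # 5-sample transit window
flux[in_transit] -= depth

# A single transit is invisible (depth << noise), but folding the curve
# at the orbital period and averaging stacks the dips, boosting the
# signal-to-noise ratio by the square root of the number of orbits.
folded = flux.reshape(-1, period).mean(axis=0)     # 200 orbits averaged
dip_phase = folded.argmin()

print(dip_phase)                 # lands inside the 5-sample transit window
print(1.0 - folded[dip_phase])   # recovered depth close to 0.001
```

The same idea – averaging away noise to reveal a faint repeating pattern – is what the neural network learns to exploit, except it does so directly from labeled light curves rather than requiring the period to be known in advance.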
Launching the AI for Social Good Program might be Google’s attempt to demonstrate that AI technologies can be used for hugely beneficial activities and will enhance humans rather than replace them with robots. Coming from Google, this is a bold move, given the company’s love for robots.
The new program aims to focus Google’s AI expertise on solving humanitarian and environmental challenges while empowering the broader ecosystem with tools and resources to help develop solutions. During the presentation, Jeff Dean insisted on Google’s overall commitment to developing AI responsibly, according to seven principles, and on its official refusal to pursue AI applications that can cause harm – read the document here.
As part of the initiative, Jacqueline Fuller, President of Google.org, announced the Google AI Impact Challenge, a global call for organizations from around the world to submit proposals on how they could use AI to solve some of the world’s most critical challenges. Google will provide selected projects with AI expertise, Google.org funding from a $25 million pool, and credits and consulting from Google Cloud.
Applications opened today, and the winners will be announced at Google I/O 2019. Organizations can get started with AI through Google’s guide of educational resources and the upcoming live sessions the internet giant will hold around the world.
Demo: Flood Forecasting
Google’s presenters showcased a few existing projects that are now part of the AI for Social Good initiative, such as a flood-forecasting application that provides more accurate flood predictions for the most vulnerable areas in the world, starting with the Patna region in India.
Demo: Acoustic Detection of Humpback Whales Using a Convolutional Neural Network
Other demos included the acoustic detection of humpback whales using a convolutional neural network, a project that aims to protect this endangered species by better understanding its migration patterns and breeding locations. A collaboration with NOAA (the National Oceanic and Atmospheric Administration), the research was unveiled at today’s Google event. It used AI and machine learning to analyze 170,000 hours of ocean audio recordings collected over the past 15 years and accurately locate humpback whales, helping inform maritime companies of the whales’ positions and reduce collisions.
According to Google, it would have taken someone listening 24 hours a day 19 years to get through the recordings. Google teamed up with NOAA to train a deep neural network to automatically identify which whale species are calling in the recordings, starting with humpback whales, whose calls are more difficult to recognize due to the complexity of the sounds they produce.
The Google team first turned the ocean sound recordings into a visual representation of the audio data called a spectrogram. They then trained the algorithm to recognize the correct species by showing it many spectrograms of humpback whale calls. See an example spectrogram below:
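As an illustration of the spectrogram step, here is a minimal sketch using SciPy on a synthetic frequency sweep standing in for a whale call; the sample rate and sweep parameters are invented for the example:

```python
import numpy as np
from scipy.signal import spectrogram

fs = 2000                          # sample rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)

# Synthetic stand-in for a whale call: a tone sweeping 100 -> 400 Hz.
sweep = np.sin(2 * np.pi * (100 * t + 75 * t**2))

# Short-time Fourier transform: slice the audio into overlapping windows
# and take the power spectrum of each, yielding a 2-D time-frequency
# image that a convolutional network can classify like a photograph.
freqs, times, Sxx = spectrogram(sweep, fs=fs, nperseg=256, noverlap=192)

print(Sxx.shape)   # (frequency bins, time frames)
```

Each column of `Sxx` is the spectrum of one short window of audio, so the rising sweep appears as a diagonal ridge in the 2-D array – exactly the kind of visual pattern a convolutional network is good at recognizing.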
Demo: AI-driven microscope for cancer detection
One of the top demos at the event was the AI-driven microscope assistive technology. The idea is to feed a computer the same view the doctor sees through the microscope. The AI detects would-be tumorous cells and overlays their location by tracing a colored contour, so the medical staff can look more closely.
The goal is to save doctors time by having them focus on areas of interest instead of scanning perfectly healthy tissue in search of a problem. The human remains in control of where and what to look at but gets a boost from the artificial intelligence.
The workflow remains identical for the human operator, and the analysis and color highlights are done in real time, at about 5 frames per second – enough for a truly interactive process. A single powerful PC with one GPU can handle the work, and faster machines can easily replace the current ones in the future if needed.
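The per-frame overlay step can be sketched in a few lines of NumPy. The `overlay_suspect_regions` helper and the probability map below are hypothetical stand-ins for the real system, where a trained neural network produces the per-pixel tumor probabilities:

```python
import numpy as np

def overlay_suspect_regions(frame, prob_map, threshold=0.5):
    """Trace a colored outline around pixels the model flags as suspicious.

    frame:    H x W x 3 uint8 microscope image
    prob_map: H x W float array of per-pixel tumor probabilities
              (in the real system this would come from the neural network).
    """
    mask = prob_map > threshold
    # A boundary pixel is inside the mask but touches at least one
    # outside pixel; detect it by AND-ing the mask with shifted copies.
    interior = mask.copy()
    interior[1:, :] &= mask[:-1, :]
    interior[:-1, :] &= mask[1:, :]
    interior[:, 1:] &= mask[:, :-1]
    interior[:, :-1] &= mask[:, 1:]
    boundary = mask & ~interior
    out = frame.copy()
    out[boundary] = (0, 255, 0)          # green contour for the pathologist
    return out

# Toy frame with one "suspicious" blob in the center of the heatmap.
frame = np.zeros((64, 64, 3), dtype=np.uint8)
prob = np.zeros((64, 64))
prob[20:40, 20:40] = 0.9
result = overlay_suspect_regions(frame, prob)
```

Run once per captured frame, a vectorized routine like this is cheap next to the network inference itself, which is why a single GPU can sustain an interactive frame rate.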
Demo: Global Fishing Watch
After the presentations, we met Brian Sullivan (Sr. Program Manager, Geo for Good), who explained how Google’s technology helps monitor fishing patterns across the planet for Global Fishing Watch, an independent non-profit organization Google co-founded with Oceana, an international ocean conservation organization, and SkyTruth, a team of experts in using satellite data to protect the environment.
By making its data, tools, and near real-time tracking of global fishing activity publicly available, Global Fishing Watch hopes to improve ocean sustainability. In this partnership, Google provides the big-data processing systems, including some machine learning to study transshipment patterns, for example.
Besides transshipments, the Global Fishing Watch system monitors individual fishing vessels’ navigation patterns using signal data from each ship’s Automatic Identification System (AIS).
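To give a feel for what AIS-based analysis looks like, here is a deliberately crude sketch that separates loitering (fishing-like) tracks from steady transits by their speed profile. Global Fishing Watch’s actual classifier is a neural network trained on labeled AIS tracks; the threshold rule and the numbers below are invented for illustration:

```python
import numpy as np

def looks_like_fishing(speeds_knots, threshold=5.0):
    """Crude heuristic: fishing vessels loiter at low, erratic speeds,
    while transiting vessels hold a steady cruising speed.
    (Illustrative only; the real system learns this from labeled data.)
    """
    speeds = np.asarray(speeds_knots, dtype=float)
    return bool(speeds.mean() < threshold and speeds.std() > 1.0)

# Hypothetical AIS speed samples (knots), one per position report.
trawling = [2.1, 3.8, 1.2, 4.5, 0.9, 3.3]
transit = [12.0, 12.1, 11.9, 12.0, 12.2, 12.0]
print(looks_like_fishing(trawling))  # True
print(looks_like_fishing(transit))   # False
```

Even this toy rule hints at why AIS data is so informative: the shape of a vessel’s track over time encodes what it is doing, not just where it is.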
Most importantly, you can highlight or turn off different data sets on the organization’s interactive maps to see fishing patterns by country’s fleet, and fishing locations by species and by country: see the photo below showing Mexico’s tuna-fishing territory. In that example, according to a Google spokesperson, the territory mapped by Global Fishing Watch helped convince fishermen that restricting a small area within their huge territory would not harm the industry or jobs.
Visit the Global Fishing Watch map, recently updated with new features (October 25th), to visualize the data yourself.