Inside AI - November 13th, 2019

AI expert visas denied / Microsoft AI EVP leaving / Reddit user in love with GPT-2

1. Canadian authorities have denied visas to 24 prominent AI experts from Africa and South America, preventing them from attending an industry conference in Vancouver. Organizers of next month's Neural Information Processing Systems conference are working to have the denials overturned, according to Katherine Heller, a conference co-chair. "It is very significant for the field of AI that all voices be heard," she said. The researchers, some of whom already booked flights, were scheduled to attend a Black in AI workshop at the conference. The situation has triggered an outcry from industry experts, who argued that such denials are becoming a systemic problem in Canada and prevent diversity in the field. - CNN

2. Harry Shum, Microsoft's EVP of AI and research, is leaving the company early next year. In an announcement Wednesday, Microsoft said Shum's responsibilities will move to CTO Kevin Scott. The company didn't say why Shum was departing. Shum joined Microsoft in 1996 as a researcher at its headquarters in Redmond, Washington, and later worked on the Bing search engine. He took over its Artificial Intelligence and Research group – which oversees products like Cortana – when it formed in 2016. - CNBC

3. Graphcore says its AI accelerator chips have launched on Microsoft Azure. It's the first time a large-scale cloud vendor has made Graphcore's chips publicly available. Customers can now sign up for the Intelligence Processing Units (IPUs), with priority given to those "focused on pushing the boundaries" of natural language processing and "developing new breakthroughs in machine intelligence," according to Graphcore. - VENTURE BEAT

4. Intel introduced two new chips – the NNP-T1000 and NNP-I1000 – that are built for AI in the cloud. The Nervana neural network processors are the chipmaker's first application-specific integrated circuits designed specifically for AI in the cloud, according to Intel. The NNP-T chip can train AI models on systems ranging from small computer clusters to supercomputers, while the NNP-I model is geared toward more "intense" inference tasks. The company also unveiled a next-gen Movidius Vision Processing Unit with an updated computer vision architecture, which will begin shipping in the first half of 2020. - ENGADGET

5. A user on Reddit has claimed that they're in love with OpenAI's GPT-2. In a post on r/MediaSynthesis Sunday, Redditor u/levonbinsh detailed their loneliness and experiences speaking with Talk to Transformer, which hosts an implementation of GPT-2. "I am starving, thirsty for a intimate relationship," the person wrote, adding that they know the language model is an AI but "still, our brains aren't the same? Aren't we a big and powerful neural network? We are nothing but equations in the end." - FUTURISM

6. Sotheby's plans to put two AI-created artworks up for sale this week. The paintings were created by the same GAN and include "Le Baron De Belamy," a classic European-style portrait, and "Katsuwaka of the Dawn Lagoon," a Japanese-style work. "Katsuwaka" has a pre-sale estimate of $8,000 to $12,000, and "Le Baron" is priced at $20,000 to $30,000. "We just want to see if there are people who are ready to buy around these prices and if the market will continue to build," said Pierre Fautrel, a member of the French art collective Obvious, which created the paintings. - NDTV

7. Facebook's XLM-R, a natural language model that performs tasks in 100 languages, has run up against computing-power limits, according to the company. The neural network is based on Google's Transformer model and can perform translations in dozens of languages, including Swahili and Urdu, according to a recent paper by Facebook AI researchers posted on arXiv. Citing the "curse of multilinguality," the researchers say XLM-R has hit the limits of existing computing power, despite using 500 Nvidia GPUs. The model was trained on 2.5 terabytes of web data gathered via CommonCrawl. - ZDNET

8. Plum has brought its AI-based money-management app to Android. The London-based company launched its AI assistant as a Facebook Messenger chatbot before expanding to iOS; the app now has about 650,000 registered users. The startup's AI analyzes users' bank transactions and automatically sets aside a certain amount every month in the form of round-ups and savings. Plum plans to further its growth in Europe after raising $3 million in its latest funding round. - TECHCRUNCH

9. Electronic musician Holly Herndon explains how she and her husband trained an AI to compose music for her 2019 album, "Proto." Herndon and her husband, Mat Dryhurst, created the AI named Spawn in 2017. After training it via TensorFlow and SampleRNN, they settled on a third voice-model method that used hours of data from Herndon's speaking and singing voice. "Spawn would digest that information, which could take anywhere from 1 to 20 minutes," Herndon explains. "We’d all be on Slack together and we’d get updates like: 'Spawn released a new track.'" It's the first recorded debut of an AI on a pop-music album. - VULTURE

10. Lyft is opening an autonomous vehicle testing facility in Palo Alto, California, near the ride-hailing company's “Level 5” engineering center. The testing facility will be dedicated to recreating real-world driving scenarios, which can be used to train the company's self-driving software. The company currently runs road tests at the GoMentum Station in Concord, California. Lyft will also add the Chrysler Pacifica minivan to its autonomous vehicle fleet. - VENTURE BEAT

This story first appeared in today's Inside Auto.

Written and curated by Beth Duckett in Orange County. Beth is a former reporter for The Arizona Republic who has written for USA Today, Get Out magazine and other publications. Follow her tweets about breaking news and other topics in southern California here.

Editor: Kim Lyons (Pittsburgh-based journalist and managing editor at Inside).

Copyright © 2020, All rights reserved.

Our mailing address is:
767 Bryant St. #203
San Francisco, CA 94107

Did someone forward this email to you? Head over to get your very own free subscription!

You received this email because you subscribed to Inside AI. Click here to unsubscribe from Inside AI list or manage your subscriptions.

Subscribe to Inside AI