Google has opened an investigation into one of its AI researchers, Margaret Mitchell, over claims that she shared thousands of internal company files with outsiders. A source told Axios that Mitchell, a co-lead of Google's ethical AI team, used automated scripts to search through her messages for examples of discriminatory treatment of AI researcher Timnit Gebru.
More:
- Gebru, Mitchell's former colleague and the other co-lead of Google's ethical AI team, was ousted last month after Google asked her to retract an AI research paper she co-authored. Gebru later voiced frustration over managers' requests to retract the paper, as well as Google's treatment of women and minorities, particularly in hiring, which generated support from colleagues and people online.
- Mitchell recently tweeted that she was documenting “current critical issues” related to Gebru's firing “point by point, inside and outside work.” She has also criticized Google's AI chief Jeffrey Dean and other company leaders for ousting Gebru, who maintains that she was fired, while Google claims she resigned.
- Google says it's investigating Mitchell after the company's systems detected that one of its accounts, presumably Mitchell's, had "exfiltrated thousands of files and shared them with multiple external accounts."
- The company has suspended the corporate account of Mitchell, who has not been fired. In a tweet, Gebru said she wonders if Mitchell is "going to get an email to her personal email accepting her 'resignation.' I have not seen a company that has this little shame in a while."
AXIOS
The U.K.'s competition watchdog is asking people to submit examples of algorithmic misuse for its investigation into how AI systems harm online consumers. The U.K. Competition and Markets Authority (CMA) says algorithms can damage online competition by helping companies promote their own products over others', limiting choices in search results, and enabling various other anti-competitive practices.
More:
- The CMA said most algorithms used by private firms online operate with little to no oversight and have been the subject of little in-depth research. According to its own analysis, "more monitoring and action is required by regulators."
- Some algorithms and automated systems alter product costs, search results, or rankings, which can lead to artificially inflated prices, it noted.
- Examples include the AI systems of Google, Amazon, and Facebook, which determine what people read and buy online. Search results are often based on previously clicked links and browsing habits rather than the closest matches.
- YouTube, for example, uses a similar system to anticipate and recommend videos a user might like.
- The CMA plans to launch "Analyzing Algorithms," a program meant to identify specific companies whose algorithms violate consumers' rights, which would help regulators pursue those cases in court.
- Possible regulations would force companies to divulge information about their algorithms to auditors, for example, and make changes to the systems when necessary.
ZDNET
Anthony Levandowski
Former Waymo self-driving engineer Anthony Levandowski, who founded a church for worshipping artificial intelligence, was among those pardoned this week by President Trump. Levandowski, whose Way of the Future nonprofit church believes that machines will one day outsmart humans, had been sentenced to 18 months in prison for stealing trade secrets from Google.
More:
- Levandowski, the church's founder and high priest, told Wired in 2017 that it's dedicated to "the realization, acceptance, and worship of a Godhead based on Artificial Intelligence (AI) developed through computer hardware and software."
- The church generally believes that an advanced enough AI system "would be indistinguishable from God." As ZDNet's Chris Matyszczyk notes, the church appears mostly defunct, but it does have a small following on Twitter.
- In 2019, Levandowski was indicted on federal charges of stealing self-driving car trade secrets and later pleaded guilty to one count. He admitted downloading more than 14,000 documents before leaving the company's self-driving division.
- Levandowski joined Uber when it acquired Otto, the self-driving truck startup he founded after leaving Waymo. Uber later fired him for not complying with a court order to turn over the documents.
- In explaining the decision to pardon Levandowski, the White House said he "led Google’s efforts to create self-driving technology" and "has paid a significant price for his actions."
CNET
An AI system that combines a convolutional neural network with satellite cameras can automatically count elephants. University of Oxford and University of Bath researchers trained an algorithm to locate elephants in images provided by Maxar satellites, which could help monitor and protect the endangered species.
More:
- The AI method showed "comparable accuracy" to human detectors and could replace the manual counting process currently done from low-flying planes.
- To save African elephants, for example, scientists "need to know where the animals are and how many there are," said University of Bath computer scientist Olga Isupova, who created the algorithm. Only 40,000 to 50,000 African elephants are believed to remain in the wild.
- The AI tool is also more efficient: the satellites can capture images covering 5,000 km² of land every few minutes. The approach sidesteps border controls and avoids disturbing the animals.
- The research was published in the journal Remote Sensing in Ecology and Conservation.
CNET
Autonomous vehicle company Aurora has partnered with truck manufacturer Paccar on commercial self-driving trucks. Engineers from both companies will pursue an “accelerated development program” for driverless trucks, starting with Paccar's Peterbilt and Kenworth models.
More:
- The firms said they'll work on “all aspects of collaboration,” including technology upgrades and integrating the trucks with Aurora's hardware and software.
- Once complete, Aurora plans to test the trucks on public roads and privately at Paccar’s technical center in Washington state. Paccar would sell the trucks to Aurora, which plans to deploy them in North America within "several years."
- Recently, Aurora acquired Uber’s Advanced Technologies Group, its self-driving vehicle division. Aurora also said it will begin testing driverless semi-trucks and other vehicles in Texas soon.
BLOOMBERG
According to a study by software firm Pegasystems, governments are poised to take over responsibility for AI regulation from the private sector by 2025. The company hired iResearch to survey more than 1,300 executives in a dozen countries about AI, extended reality, hyperautomation, and other next-generation technologies.
Some findings:
- 65% of executives thought current levels of private-sector AI governance aren't enough to manage its growth.
- 78% said that, for now, they prefer full or equally shared responsibility for AI regulation.
- Five years from now, 75% said they expect governments to be largely or fully responsible.
- 53% expressed concerns that any regulation, public or private, could stifle their innovations.
- 27% said their companies currently lack an AI governance leader.
AITHORITY
Beth is a former investigative reporter for The Arizona Republic who authored a book about the U.S. solar industry. A graduate of the Walter Cronkite School of Journalism, she won a First Amendment Award and received a Pulitzer Prize nomination for her co-reporting on the rising costs of Arizona's taxpayer-funded pension systems.
Editor
Charlotte Hayes-Clemens is an editor and writer based in Vancouver. She has dabbled in both the fiction and non-fiction world, having worked at HarperCollins Publishers and more recently as a writing coach for new and self-published authors. Proper semi-colon usage is her hill to die on.