Facebook's mega-chatbot has 'a persona, discusses nearly any topic, shows empathy.' Perfect for CEO version 2

Plus: OpenAI's music-making software isn't half bad ... and Banjo boss's KKK shame

Roundup Welcome to another summary of AI-related news beyond what we've already covered.

Blender is Facebook’s largest chatbot yet: Facebook eggheads have built their largest chatbot to date, packing in 9.4 billion parameters.

Blender was created by training large transformer-based neural networks on 1.5 billion comments scraped from Reddit message boards. The model was then fine-tuned on datasets of question-and-answer dialogues between people getting to know each other, as well as more emotional conversations and wider discussions of topics of interest.

By training on these various types of text, Blender reportedly has “the ability to assume a persona, discuss nearly any topic, and show empathy.” Facebook reckons the chatty model “feels more human” than previous chatbots.

“Conversation is an art that we practice every day — when we’re debating food options, deciding the best movie to watch after dinner, or just discussing current events to broaden our worldview,” the team's Stephen Roller, Jason Weston, and Emily Dinan wrote.

“For decades, AI researchers have been working on building an AI system that can converse as well as humans can: asking and answering a wide range of questions, displaying knowledge, and being empathetic, personable, engaging, serious, or fun, as circumstances dictate.

“So far, systems have excelled primarily at specialized, preprogrammed tasks, like booking a flight. But truly intelligent, human-level AI systems must effortlessly understand the broader context of the conversation and how specific topics relate to each other.”

Blender is able to maintain some level of coherence during conversations, asking and answering questions appropriately over the course of roughly 14 input-response turns. Once the discussion goes beyond that point, it starts to fall apart. As detailed in this paper on arXiv, Blender sometimes repeats itself, ignores questions, or just spits out false information – something the researchers describe as “hallucinating knowledge.”

Open-ended chatbots are fascinating for research, though their inability to sustain a conversation limits their practicality: after a while, you're very aware you're talking to an algorithm. Having said that, quite a few humans aren't any good at small talk, either.

There is probably a sweet spot in which voice-controlled AI assistants built on this technology could carry out useful tasks, such as setting alarms or playing music, using free and natural language, rather than relying on identifying and acting on particular verbs and nouns and responding with pre-programmed lines.

The team has published its open-source code for Blender here.
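If you want to poke at the bot yourself, the released checkpoints ship through Facebook's ParlAI framework. Here is a minimal sketch of chatting with one from Python – it assumes the smaller 90M-parameter distilled model is published in ParlAI's model zoo as zoo:blender/blender_90M/model, so check the repo for exact model names; the full 9.4-billion-parameter version needs far more GPU memory.

```python
# Minimal sketch: chat with a released Blender checkpoint via ParlAI.
# Assumes `pip install parlai` and that the 90M-parameter model is published
# in the model zoo as zoo:blender/blender_90M/model (verify against the repo).
from parlai.core.agents import create_agent_from_model_file

# Downloads the checkpoint on first use, then loads it as a dialogue agent.
agent = create_agent_from_model_file("zoo:blender/blender_90M/model")

for prompt in ["Hi! What do you like to cook?", "Any tips for a beginner?"]:
    # ParlAI agents follow an observe/act cycle over message dicts;
    # episode_done=False keeps the conversation context between turns.
    agent.observe({"text": prompt, "episode_done": False})
    reply = agent.act()
    print(f"You: {prompt}")
    print(f"Blender: {reply['text']}")
```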

What happens when you let computers make music: OpenAI’s Jukebox is machine-learning software that "generates music, including rudimentary singing."

You can listen to some output samples, and read about how it all works, here. All the audio, including the singing, was created by a neural network. It does rock, pop, blues, and more. The lyrics were co-written by OpenAI's brainiacs and an algorithm separate from Jukebox.

Jukebox has learned patterns and structure in music so that its output sounds pretty convincing, and not garbled machine-generated noise. “A typical four-minute song at CD quality has over 10 million timesteps,” OpenAI explained. “For comparison, GPT-2 had 1,000 timesteps and OpenAI Five took tens of thousands of timesteps per game. Thus, to learn the high level semantics of music, a model would have to deal with extremely long-range dependencies.”
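That 10-million figure is easy to sanity-check: modeling raw CD-quality audio means one timestep per sample, at 44,100 samples per second.

```python
# Back-of-the-envelope check on the "over 10 million timesteps" claim:
# CD-quality audio is sampled at 44,100 Hz, one timestep per raw sample.
SAMPLE_RATE_HZ = 44_100
SONG_LENGTH_SECONDS = 4 * 60  # a typical four-minute song

timesteps = SAMPLE_RATE_HZ * SONG_LENGTH_SECONDS
print(f"{timesteps:,} timesteps")  # 10,584,000 – comfortably over 10 million
```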

Jukebox was trained on 1.2 million songs alongside their corresponding lyrics and metadata describing the artist and genre. That data helps the model create songs in a particular style, or ones modeled on the sound of a specific artist, as required.

Some genres are trickier than others for the software. For example, the fast pace of rapping can throw the model off track.

The AI-generated tunes are some of the most convincing examples so far, though they’re not as good as human-made music yet. “While the generated songs show local musical coherence, follow traditional chord patterns, and can even feature impressive solos, we do not hear familiar larger musical structures such as choruses that repeat,” OpenAI said. Jukebox is also very computationally intensive: it takes around nine hours to render a single minute of music, meaning a full four-minute song works out to roughly 36 hours of compute.

There are more than 7,000 Jukebox songs across various genres from rock to rap that you can listen to here.

CEO of AI surveillance company was a white supremacist: Banjo – the US startup that uses AI to scan and analyze real-time feeds of surveillance cameras, social media posts, and so on – has temporarily halted all of its government contract work in Utah.

This comes after its CEO was revealed to have had ties to white supremacists. The attorney general of Utah launched an investigation into the upstart, and paused the state's $20m contract with the biz. The University of Utah also suspended its contract with the company.

As a 17-year-old, Damien Patton, now founder and chief exec of Banjo, was the driver in a 1990 drive-by shooting of a synagogue in Nashville, Tennessee, carried out with members of the Ku Klux Klan. No one was killed, though a window was shot out.

He pleaded guilty to juvenile delinquency, while two KKK members were charged with federal hate crimes, according to an investigation by Matt Stroud for OneZero. “We believe that the Blacks and the Jews are taking over America, and it’s our job to take America back for the White race,” Patton said during his trial.

Banjo had entered into a five-year, $20m contract with Utah to analyze, in real time, data from the state’s CCTV cameras, traffic cams, and 911 calls, as well as recordings of city council meetings, and to automatically flag up in-progress crimes to officers. Now, that contract has been paused. Utah’s Attorney General Sean Reyes has hired an outside team to perform an audit of Banjo’s technology to make sure its algorithms aren’t being misused, and to look for any potential biases or security issues.

“Following yesterday’s announcement by the Utah Attorney General’s Office, Banjo has decided to suspend all Utah contracts by not ingesting any government data or providing any services to government entities until an independent third party audit has been contracted and completed,” the company said in a statement.

“The audit will have direct oversight by the state and will look to ensure there’s no bias in the technology, that Banjo is not a surveillance company and that all data for the state is being handled per the contract.”

Patton also described his early adulthood as "a dark and despicable period in my life that I am extremely remorseful about and sorry for. I am deeply ashamed of my actions and do not hold any of these beliefs, which I find abhorrent and indefensible." ®
