AI agents are everywhere. The next AI company to debut could well be offering agents as part of its portfolio. They are taking over customer service, democratising AI development, and winning over Indian founders.
Even as their definition evolves, they keep appearing at the forefront of new developments.
Making things smoother still, two developers at the ElevenLabs London Hackathon held last week created a new protocol, GibberLink, that changes the way one AI communicates with another.
What is GibberLink?
Boris Starkov and Anton Pidkuiko created a custom protocol that enables AI agents to recognise each other and switch to a new mode of communication, where structured data is transmitted over sound waves instead of words.
The goal was to explore the limits of traditional AI-to-AI speech and strip out unnecessary complexity. The developers note that plain English is a poor fit for AI-to-AI conversation: it is inefficient, error-prone, and carries the relatively high compute cost of voice generation.
Hence, they proposed the new protocol: a sound-level protocol built on ggwave, an open-source data-over-sound library.
The library lets devices, even air-gapped ones, exchange small amounts of data using sound. It implements a simple transmission protocol based on FSK (frequency-shift keying), making it easy to integrate into different projects.
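To see the mechanism at its simplest, here is a minimal loopback sketch using ggwave's Python bindings (the ggwave package on PyPI). In a real exchange, the encoded waveform would be played through a speaker and captured by the other device's microphone (the library's examples use pyaudio for that); the payload text and parameter values below are illustrative.

```python
import ggwave

# Encode a short text payload into an audible waveform:
# bytes of float32 PCM samples, 48 kHz mono by default.
waveform = ggwave.encode("hello from agent A", protocolId=1, volume=20)

# Decode by feeding the samples back in small chunks,
# the way a microphone capture loop would deliver them.
instance = ggwave.init()
decoded = None
chunk = 1024 * 4  # 1024 float32 samples = 4096 bytes per chunk
for i in range(0, len(waveform), chunk):
    res = ggwave.decode(instance, waveform[i:i + chunk])
    if res is not None:
        decoded = res.decode("utf-8")
ggwave.free(instance)

print("Received:", decoded)  # -> Received: hello from agent A
```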
How Does It Work?
Simply put, an agent switches to the sound-level protocol when it detects another AI agent on the line; if it detects a human, it sticks to speech.
In the demonstration video, which went viral, two of ElevenLabs' Conversational AI agents are seen talking to each other. The first agent calls on behalf of a person to inquire about the hotel's availability for a wedding. The second responds by acknowledging the request and clarifying that it, too, is an AI agent.
Next, the second agent asks the first if it wants to switch to GibberLink mode for more efficient communication. Moments later, the two switch to the new protocol, chirping away like machines conversing in their own secret language (something like Morse code).
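Conceptually, the switch is a simple branch in the agent's reply path. The sketch below is hypothetical, not GibberLink's actual code: synthesize_speech is a stand-in for a text-to-speech engine (ElevenLabs' voices in the demo), while ggwave supplies the data-over-sound leg.

```python
import ggwave


def synthesize_speech(text: str) -> bytes:
    """Placeholder for a text-to-speech engine; plug in a real TTS provider here."""
    raise NotImplementedError


def respond(message: str, counterpart_is_ai: bool, switch_agreed: bool) -> bytes:
    """Return reply audio, choosing the channel that suits the counterpart.

    Hypothetical sketch of the GibberLink-style decision.
    """
    if counterpart_is_ai and switch_agreed:
        # Both sides have identified as AI agents and agreed to switch:
        # skip costly speech synthesis and emit the payload as FSK tones.
        return ggwave.encode(message)
    # A human is (or may still be) on the line: stay with spoken audio.
    return synthesize_speech(message)
```

Keeping speech as the default and switching only after both sides explicitly agree is what lets the same agent serve humans and machines on the same phone line.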
The Skynet Era Benefits
Watching the two agents talk to each other might look like a scene straight out of a Terminator film, but the developers say there are real benefits. To start with, GibberLink skips speech generation entirely when two AI agents are involved, which, by their estimate, accounts for around 90% of the compute cost.
With that overhead gone, the protocol transmits the same information much faster, an improvement of up to 80%.
The developers also claim the protocol communicates more reliably in noisy environments, making it far less error-prone than speech.
GibberLink’s Got People Talking
While the project began as a hackathon experiment, it has got the internet talking and wondering what's next. Georgi Gerganov, creator of the ggwave sound library, took to X to praise the demo and congratulate the developers on winning first place at the hackathon.
Luke Harries, from ElevenLabs, called the development “mind-blowing”.

Users on X likened it to the sci-fi movie Men in Black, in which aliens have a language of their own. Memes circulated about a future in which machines decide to kill all humans. Some even suggested that this capability could eventually be baked into every app at the model level.
The Future of AI Communication
Given that we want AI to be autonomous, a dedicated AI-to-AI communication protocol sounds like an interesting idea.
It is intriguing to watch AI assistants communicate in their own language. However, it also means less human involvement, prompting the question of what kind of oversight is necessary.