
Do Large Language Models Dream of Agents?

During sleep, the human brain sorts through a variety of memories, consolidating the important ones while discarding those that don’t matter. What if AI could do the same?

Bilt, a company that offers local shopping and restaurant deals, recently deployed several million agents in the hope of doing just that.

Bilt uses technology from a startup called Letta that allows agents to learn from past conversations and share memories with one another. Through a process called “sleeptime compute,” an agent decides which information should be stored in its long-term memory.

“We can make one update to a [memory] block and have it propagate to hundreds of agents,” said Andrew Fitz, an AI engineer at Bilt. “This is useful in any situation where you want to control an agent’s behavior.”
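The idea of a single memory update reaching many agents at once can be sketched in a few lines. This is a minimal illustration, not Letta’s actual API: the class and attribute names here are invented for the example. The key point is that agents hold a reference to a shared block rather than a copy of it.

```python
# Hypothetical sketch (not Letta's real API): many agents share one
# memory "block" by reference, so a single edit changes all of them.

class MemoryBlock:
    def __init__(self, text):
        self.text = text

class Agent:
    def __init__(self, name, shared_block):
        self.name = name
        self.block = shared_block  # a reference, not a copy

    def context(self):
        # The shared block is prepended to every prompt this agent sees.
        return f"[shared memory] {self.block.text}\n[agent] {self.name}"

policy = MemoryBlock("Offer the standard restaurant deal.")
agents = [Agent(f"agent-{i}", policy) for i in range(100)]

# One update propagates to all 100 agents at once.
policy.text = "Offer the spring promotion instead."
assert all("spring promotion" in a.context() for a in agents)
```

Because every agent reads from the same block at prompt-assembly time, editing the block once is enough to change the behavior of the whole fleet.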

Large language models can typically only “recall” information if it is included in the context window. If you want a chatbot to remember your most recent conversation, you need to paste it back into the chat.

Most AI systems can only handle a limited amount of information in the context window before their ability to use the data falters and they hallucinate or become confused. The human brain, by comparison, is able to file away useful information and recall it later.
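The recall limitation described above comes down to a token budget: only what fits in the window is visible to the model, so older turns silently fall out. Here is a toy sketch of that truncation, using character counts as a stand-in for tokens (real systems count tokens, not characters).

```python
# Toy sketch of the context-window limit: a model only "remembers" what
# fits in its window, so the oldest conversation turns are dropped first.

def build_context(history, budget):
    """Keep the most recent messages whose combined length fits the budget."""
    kept, used = [], 0
    for msg in reversed(history):          # walk backward from the newest turn
        if used + len(msg) > budget:
            break                          # everything older is forgotten
        kept.append(msg)
        used += len(msg)
    return list(reversed(kept))            # restore chronological order

history = ["turn one " * 10, "turn two " * 10, "turn three " * 10]
ctx = build_context(history, budget=200)
assert history[-1] in ctx      # recent turns survive
assert history[0] not in ctx   # the oldest turn falls out of the window
```

Memory systems like the ones described in this story exist precisely to decide what should be written down outside this window so it can be brought back later.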

“Your brain is continuously improving, adding more information like a sponge,” said Charles Packer, Letta’s CEO. “With language models, it’s exactly the opposite. Run these language models in a loop for long enough and the context becomes poisoned; you just want to reset.”

Packer and his cofounder Sarah Wooders previously built MemGPT, an open source project that aims to help LLMs decide which information should be kept in short-term versus long-term memory. With Letta, the duo has extended that approach so that agents can learn in the background.

The collaboration between Bilt and Letta is part of a broader push to give AI the ability to store and recall useful information, which could make chatbots smarter and agents less error-prone. Memory remains underdeveloped in modern AI, which limits the intelligence and reliability of AI tools, according to the experts I spoke to.

Harrison Chase, cofounder and CEO of LangChain, another company that has developed methods for improving memory in AI agents, sees memory as part of context engineering, in which a user or engineer decides what information to feed into the context window. LangChain offers companies several different types of memory storage for agents, from long-term facts about users to memories of recent experiences. “Memory, I would argue, is a form of context,” says Chase. “A large part of an AI engineer’s job is basically getting the model the right context [information].”
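Context engineering, as described above, means choosing which stored memories are worth spending window space on for a given query. The sketch below uses naive word overlap as the relevance score; this is purely illustrative, and the memory strings are invented for the example (production systems typically rank by embedding similarity instead).

```python
# Toy illustration of "context engineering": pick which stored memories
# get fed into the model's context for a given query.
# (Naive word-overlap scoring; real systems usually use embeddings.)

def select_memories(memories, query, k=2):
    """Rank stored memories by word overlap with the query; keep the top k."""
    q = set(query.lower().split())
    scored = sorted(memories, key=lambda m: -len(q & set(m.lower().split())))
    return scored[:k]

memories = [
    "User prefers vegetarian restaurants",
    "User's lease renews in March",
    "User asked about pizza deals last week",
]
picked = select_memories(memories, "any good restaurants or deals nearby")
assert "User prefers vegetarian restaurants" in picked
assert "User asked about pizza deals last week" in picked
```

Whatever the scoring method, the shape of the job is the same: a limited window, a larger store, and a policy for deciding what makes the cut.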

Consumer AI tools are becoming less forgetful, too. This February, OpenAI announced that ChatGPT will store relevant information in order to provide users with a more personalized experience, although the company has not revealed how this works.

Letta and LangChain make the process of recall more transparent to the engineers who build AI systems.

“I think it’s very important not only for the models to be open but also for the memory systems to be open,” said the CEO of an AI hosting platform who is also an investor in Letta.

Intriguingly, Letta CEO Packer suggests it is also important for AI models to learn what to forget. “When a user says, ‘That one project we were working on, wipe it from your memory,’ the agent should be able to go back and rewrite that memory.”
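The deletion request Packer describes can be sketched as a rewrite of the memory store. This is an illustrative stand-in, not Letta’s implementation, and the memory entries are invented: on a forget request, every entry mentioning the topic is dropped before the store is used again.

```python
# Sketch of "learning to forget": on a deletion request, rewrite the
# memory store so no entry mentioning the topic survives.
# (Illustrative only; not how Letta actually implements this.)

def forget(memories, topic):
    """Return a memory store with all entries mentioning the topic removed."""
    return [m for m in memories if topic.lower() not in m.lower()]

memories = [
    "Drafted roadmap for Project Nightingale",
    "User prefers morning meetings",
    "Project Nightingale deadline moved to May",
]
memories = forget(memories, "Project Nightingale")
assert memories == ["User prefers morning meetings"]
```

The subtle part in a real system is that references to the forgotten topic may be woven into other memories, which is why Packer frames it as going back and rewriting memory rather than simply deleting a record.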

This vision of artificial memories and dreams makes me think of Do Androids Dream of Electric Sheep? by Philip K. Dick, the mind-bending novel that inspired the stylishly dystopian film Blade Runner. Large language models are not yet as impressive as the story’s rebellious replicants, but their memories, it seems, are just as fragile.


This is an edition of Knight’s AI Lab newsletter. Read past editions here.
