AI applications on smartphones are increasingly processing data locally and at high speed, instead of relying solely on a connection to remote servers over the Internet. Several Android device manufacturers have already announced developments in this area, as reported by Android Authority. From Apple, by contrast, nothing had been heard on the subject so far. Now, however, the company's developers have published a research paper on exactly this topic.
Performance increased several times over
The paper explains how large language models (LLMs) can be run locally on devices with limited computing power and memory. Apple devices are typically fairly frugal when it comes to RAM. Apple's developers have found techniques that significantly increase performance when running AI applications directly on the device: the paper reports a 4-5x speedup on the CPU and a 20-25x speedup on the GPU compared with conventionally loading the entire model into memory.
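The article does not spell out the mechanism, but the core idea of the approach is to keep the model weights in flash storage and pull only the parts needed for the current step into RAM. The sketch below illustrates that idea in Python; the layer sizes, file name, and simple LRU cache are hypothetical illustrations and are not taken from Apple's paper.

```python
# Illustrative sketch of on-demand weight loading (not Apple's actual code):
# weights stay on flash (here: a memory-mapped file) and only the layers
# needed right now are copied into a small DRAM cache.

from collections import OrderedDict
import numpy as np

HIDDEN = 256          # hypothetical layer width
NUM_LAYERS = 32       # hypothetical layer count
CACHE_LAYERS = 4      # how many layers fit in the DRAM budget

# Pretend this memory-mapped array is the model stored on flash.
flash_weights = np.memmap("weights.bin", dtype=np.float16,
                          mode="w+", shape=(NUM_LAYERS, HIDDEN, HIDDEN))

dram_cache: "OrderedDict[int, np.ndarray]" = OrderedDict()

def get_layer(idx: int) -> np.ndarray:
    """Return layer weights, reading from flash only on a cache miss."""
    if idx in dram_cache:
        dram_cache.move_to_end(idx)            # mark as recently used
        return dram_cache[idx]
    if len(dram_cache) >= CACHE_LAYERS:
        dram_cache.popitem(last=False)         # evict least recently used
    dram_cache[idx] = np.array(flash_weights[idx])  # explicit flash -> DRAM copy
    return dram_cache[idx]

def forward(x: np.ndarray) -> np.ndarray:
    """Toy forward pass that touches one layer at a time."""
    for i in range(NUM_LAYERS):
        x = x @ get_layer(i)
    return x

print(forward(np.ones((1, HIDDEN), dtype=np.float16)).shape)
```

At no point does more than a handful of layers reside in RAM at once, which is the kind of trade-off that lets a large model run on a device with little memory at the cost of extra flash reads.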
Older iPhone generations benefit too
The results suggest that you will not necessarily need the latest smartphone to chat with ChatGPT or other LLMs at response times close to those of a natural conversation. Users of older iPhone generations could also benefit from Apple's findings in the future.