Samsung Boasts Exynos 2400 GPU Performance, Exynos 2500 to Be 3nm Chip

Samsung has introduced the Exynos 2400 at its System LSI Tech event. Compared with the Exynos 2200, Samsung claims the new chip offers a 1.7x boost in CPU performance and a 14.7x rise in AI performance. Samsung also said that, thanks to on-device generative AI capabilities, the chip can generate images from text even without an internet connection. Its competitor, the Snapdragon 8 Gen 3, has likewise been introduced by Qualcomm. The Exynos 2400 will power the base and Plus variants of the Galaxy S24 series, while the Snapdragon 8 Gen 3 will power the Galaxy S24 Ultra.

It will be interesting to compare the two newly launched chipsets and see which performs better. Samsung, for its part, appears confident in how its new chip performs, stating specifically that the Exynos 2400 comes with a better GPU than its competitors. At the Semiconductor EXPO 2023 keynote held at COEX in Gangnam-gu, Seoul, Park Yong-in, President of System LSI, commented, “It will do well because it has better GPU (graphics processing unit) performance than competitors.” He said the company has high hopes for the Exynos 2400, suggesting that even when tested against the Snapdragon 8 Gen 3, it will come out ahead.

It’s coming—the Exynos 2500—and will be produced on a 3nm process

Park Yong-in also said that a 3nm fabrication process will be used for the flagship Exynos chip debuting next year. He didn’t name the upcoming processor, but we can expect it to be called the Exynos 2500. It will be produced using Samsung Foundry’s recently developed 3nm GAA fabrication process, which Samsung says will deliver better performance and power efficiency than TSMC’s 3nm process. Park also disclosed that NPUs will be used in place of GPUs for future AI processing, driven by the demands of AI and generative AI applications: GPUs are currently used in cloud servers, but they are expensive and consume a lot of power. He concluded that NPUs will be better suited to AI use cases.
