Research Report on Automotive Memory Chip Industry and Its Impact on Foundation Models, 2025

  • Report

  • 580 Pages
  • April 2025
  • Region: China, Global
  • Research In China
  • ID: 6078205
From 2D+CNN small models to BEV+Transformer foundation models, the number of model parameters has soared, making memory a performance bottleneck.

The global automotive memory chip market is expected to be worth over USD 17 billion in 2030, up from about USD 4.3 billion in 2023, a CAGR of roughly 22% over the period. Automotive memory chips accounted for 8.2% of automotive semiconductor value in 2023, a figure projected to rise to 17.4% in 2030, indicating a substantial increase in memory chip costs.
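
As a quick consistency check (not from the report itself), compounding the 2023 base at the stated CAGR over the seven-year horizon reproduces the 2030 figure; a minimal sketch in Python:

```python
# Consistency check of the market forecast above: compound the 2023 base at
# the stated ~22% CAGR over 2023-2030. All inputs are the report's own figures.
base_2023 = 4.3        # USD billion, 2023 market size
cagr = 0.22            # compound annual growth rate
years = 2030 - 2023    # 7-year horizon

projected_2030 = base_2023 * (1 + cagr) ** years
print(f"Projected 2030 market: ~USD {projected_2030:.1f} billion")  # ~17.3
```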

The main driver for the development of the automotive memory chip industry lies in the rapid rise of automotive LLMs. From the previous 2D+CNN small models to BEV+Transformer foundation models, the number of model parameters has significantly increased, leading to a surge in computing demands. CNN models typically have fewer than 10 million parameters, while foundation models (LLMs) generally range from 7 billion to 200 billion parameters. Even after distillation, automotive models can still have billions of parameters.

From a computing perspective, BEV+Transformer foundation models, typically built on a LLaMA-style decoder architecture, rely heavily on the Softmax operator. Because Softmax parallelizes less well than traditional convolution operators, memory becomes the bottleneck. Memory-intensive models such as GPT place especially high demands on memory bandwidth, and common autonomous driving SoCs on the market often run into the "memory wall" problem.
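
To see why such workloads are memory-bound rather than compute-bound, a rough roofline-style sketch is given below. It assumes INT8 weights read once per generated token and about two operations per weight; the 254 TOPS compute figure for an Orin-class SoC is an illustrative assumption, while the 204.5GB/s bandwidth figure comes from later in this summary.

```python
# Roofline-style estimate: arithmetic intensity of decoder token generation
# vs. the compute/bandwidth balance point of an Orin-class SoC.
# Assumptions: INT8 weights (1 byte each), ~2 ops per weight per token,
# 254 TOPS INT8 compute (illustrative), 204.5 GB/s DRAM bandwidth.
params = 7e9
bytes_per_token = params * 1          # every weight crosses the memory bus once
ops_per_token = 2 * params            # multiply + accumulate per weight

arithmetic_intensity = ops_per_token / bytes_per_token   # ~2 ops/byte
soc_compute = 254e12                  # ops/s
soc_bandwidth = 204.5e9               # bytes/s
balance_point = soc_compute / soc_bandwidth              # ~1240 ops/byte

# 2 ops/byte << ~1240 ops/byte: the SoC starves for data long before it runs
# out of compute, i.e. the "memory wall".
print(arithmetic_intensity, round(balance_point))
```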

An end-to-end system essentially embeds a small LLM. As the amount of data fed in increases, the foundation model's parameter count will continue to grow: the initial model size is around 10 billion parameters, and through continuous iteration it will eventually exceed 100 billion.

On April 15, 2025, at its AI sharing event, XPeng disclosed for the first time that it is developing the XPeng World Foundation Model, a 72-billion-parameter ultra-large autonomous driving model. XPeng's experiments show a clear scaling-law effect across models with 1 billion, 3 billion, 7 billion, and 72 billion parameters: the larger the parameter scale, the greater the model's capabilities, and for models of the same size, the more training data, the better the performance.

The main bottleneck in multimodal model training is not only GPU capacity but also data access efficiency. XPeng has independently developed its underlying data infrastructure (Data Infra), increasing data upload capacity by 22 times and training data bandwidth by 15 times. By optimizing GPU/CPU usage and network I/O, it has improved model training speed by 5 times. XPeng currently uses up to 20 million video clips to train its foundation model, a figure that will increase to 200 million this year.

In the future, XPeng will deploy the XPeng World Foundation Model to vehicles by distilling small models in the cloud. The parameter scale of automotive foundation models will only continue to grow, posing significant challenges to computing chips and memory. To address this, XPeng has developed its own Turing AI chip, which boasts 20% higher utilization than general-purpose automotive high-performance chips and can handle foundation models with up to 30B (30 billion) parameters. By comparison, Li Auto's current VLM (Vision-Language Model) has about 2.2 billion parameters.

More model parameters generally mean higher inference latency, so solving the latency problem is crucial. The Turing AI chip is expected to deliver large improvements in memory bandwidth through multi-channel design or advanced packaging technology, so as to support running 30B-parameter foundation models locally.
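
A back-of-envelope estimate of what that implies, using the same data-movement model as the bandwidth discussion below; the 30Hz target rate and INT8 weights are assumptions for illustration, not XPeng figures:

```python
# Rough DRAM bandwidth needed to run a 30B-parameter model locally if every
# INT8 weight must be read once per token and one token is produced per frame.
params = 30e9
bytes_per_param = 1        # INT8 (assumed)
target_rate_hz = 30        # one token per 30 Hz camera frame (assumed)

model_bytes = params * bytes_per_param       # 30 GB of weights
required_bw = model_bytes * target_rate_hz   # bytes per second
print(f"~{required_bw / 1e9:.0f} GB/s of memory bandwidth")   # ~900 GB/s
```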

Memory bandwidth determines the upper limit of inference computing speed. LPDDR5X is widely adopted but still falls short. GDDR7 and HBM may be put on the agenda.

Assuming a foundation model has 7 billion parameters at INT8 precision (typical for automotive use), it occupies 7GB of memory. Tesla's first-generation FSD chip has a memory bandwidth of 63.5GB/s, meaning it can generate at most one token every 110 milliseconds, a frame rate below 10Hz, compared with the typical 30Hz image frame rate in autonomous driving.

Nvidia Orin, with a memory bandwidth of 204.5GB/s, generates one token every 34 milliseconds (7GB ÷ 204.5GB/s = 0.0343s, about 34ms), barely reaching 30Hz (frame rate = 1 ÷ 0.0343s ≈ 29Hz). Note that this accounts only for data transfer time and completely ignores computation time, so real-world speed will be considerably lower than these figures.
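
Both figures follow from the same lower-bound model (time per token ≥ model size ÷ memory bandwidth, compute ignored); a minimal sketch:

```python
# Lower-bound latency model used above: every weight byte must cross the
# memory bus once per generated token, so bandwidth caps the token rate.
def token_time_s(model_gb: float, bandwidth_gb_s: float) -> float:
    """Seconds per token from data movement alone (compute ignored)."""
    return model_gb / bandwidth_gb_s

for name, bw in [("Tesla FSD gen1", 63.5), ("NVIDIA Orin", 204.5)]:
    t = token_time_s(7.0, bw)          # 7 GB of INT8 weights
    print(f"{name}: {t * 1000:.0f} ms/token, ~{1 / t:.0f} Hz")
# Tesla FSD gen1: 110 ms/token, ~9 Hz
# NVIDIA Orin: 34 ms/token, ~29 Hz
```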

DRAM Selection Path (1): LPDDR5X will be widely adopted, and the LPDDR6 standard is still being formulated.

Apart from Tesla, current automotive chips support at most LPDDR5. The industry's next step is to promote LPDDR5X. For example, Micron has launched an LPDDR5X + DLEP DRAM automotive solution that has passed ISO 26262 ASIL-D certification and meets critical automotive functional safety (FuSa) requirements.

Nvidia Thor-X already supports automotive LPDDR5X, raising memory bandwidth to 273GB/s, and also supports the PCIe 5.0 interface. Thor-X-Super has an astonishing memory bandwidth of 546GB/s, using 512-bit-wide LPDDR5X memory to ensure extremely high data throughput. In practice, the Super version, much as in Apple's chip series, simply integrates two X chips into one package, and it is not expected to enter mass production in the short term.
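
The bandwidth figures are what bus width times data rate gives; in the sketch below, the 8533MT/s LPDDR5X data rate and the 256-bit width for Thor-X are assumptions consistent with the stated numbers, not figures from the report.

```python
# Peak DRAM bandwidth = (bus width in bytes) x (data rate in MT/s).
# 8533 MT/s LPDDR5X and a 256-bit Thor-X bus are assumptions for illustration.
def dram_bandwidth_gb_s(bus_width_bits: int, data_rate_mt_s: int) -> float:
    return bus_width_bits / 8 * data_rate_mt_s / 1000

print(dram_bandwidth_gb_s(256, 8533))   # ~273 GB/s (Thor-X class)
print(dram_bandwidth_gb_s(512, 8533))   # ~546 GB/s (Thor-X-Super class)
```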

Thor comes in multiple versions, with five currently known: ① Thor-Super, with 2000 TOPS of computing power; ② Thor-X, with 1000 TOPS; ③ Thor-S, with 700 TOPS; ④ Thor-U, with 500 TOPS; ⑤ Thor-Z, with 300 TOPS. Lenovo's Thor-based central computing unit, the world's first, plans to adopt dual Thor-X chips.

Micron's 9600MT/s LPDDR5X is already sampling, but it targets mobile devices and has no automotive-grade version yet. Samsung's new LPDDR5X product, the K3KL9L90DM-MHCU, targets high-performance applications from PCs and servers to vehicles and emerging on-device AI. It delivers 1.25 times the speed and 25% better power efficiency of the previous generation, with a maximum operating temperature of 105°C; mass production started in early 2025. A single K3KL9L90DM-MHCU provides 8GB over an x32 bus, so eight chips total 64GB.

As LPDDR5X moves toward 9600Mbps and even 10Gbps, JEDEC has started developing the next-generation LPDDR6 standard, targeting 6G communications, L4 autonomous driving, and immersive AR/VR scenarios. LPDDR6 is expected to reach data rates above 10.7Gbps, possibly up to 14.4Gbps, with bandwidth and energy-efficiency improvements of around 50% over current LPDDR5X. However, mass production of LPDDR6 memory may not occur until 2026. Qualcomm's next-generation flagship chip, the Snapdragon 8 Elite Gen 2 (codenamed SM8850), will support LPDDR6; automotive LPDDR6 may take even longer to arrive.

DRAM Selection Path (2): GDDR6 is already installed in vehicles but faces cost and power consumption issues. A GDDR7+LPDDR5X hybrid memory architecture may be viable.

Aside from LPDDR5X, another path is GDDR6 or GDDR7. Tesla's second-gen FSD chip already supports first-gen GDDR6: HW4.0 uses 32GB of GDDR6 (model MT61M512M32KPA-14) running at 1750MHz (by comparison, even the minimum LPDDR5 frequency is above 3200MHz). Being first-generation GDDR6, its speed is relatively low. Even with GDDR6, smoothly running 10-billion-parameter foundation models remains unfeasible, though it is currently the best option available.

Tesla’s third-gen FSD chip is likely under development and may be completed in late 2025, with support for at least GDDR6X.

The next-generation GDDR7 standard was officially released in March 2024, though Samsung had already unveiled the world's first GDDR7 in July 2023. Both SK Hynix and Micron have since introduced GDDR7 products. GDDR requires a dedicated physical layer and controller, which a chip must integrate in order to use GDDR; companies such as Rambus and Synopsys sell the relevant IP.

Future autonomous driving chips may adopt a hybrid memory architecture, for example using GDDR7 for high-load AI tasks and LPDDR5X for low-power general computing, balancing performance and cost.

DRAM Selection Path (3): HBM2E is already deployed in L4 Robotaxis but remains far from production passenger cars. Memory chip vendors are working on migrating HBM technology from data centers to edge devices.

High bandwidth memory (HBM) is primarily used in servers. Stacking DRAM dies with TSV technology increases not only the cost of the memory itself but also adds the cost of TSMC's CoWoS packaging, whose capacity is currently tight and expensive. HBM is priced far above the LPDDR5X, LPDDR5, and LPDDR4X commonly used in production passenger cars, and is not economical.

SK Hynix's HBM2E is exclusively used in Waymo's L4 Robotaxis, offering 8GB of capacity, a transmission rate of 3.2Gbps, and an impressive 410GB/s of bandwidth, setting a new industry benchmark.

SK Hynix is currently the only vendor capable of supplying HBM that meets stringent AEC-Q automotive standards, and it is actively collaborating with autonomous driving giants such as NVIDIA and Tesla to expand HBM applications from AI data centers to intelligent vehicles.

Both SK Hynix and Samsung are working to migrate HBM from data centers to edge devices such as smartphones and cars. Adoption of HBM in mobile devices will focus on improving edge AI performance and low-power design, driven by technological innovation and industry-chain synergy. Cost and yield remain the primary short-term challenges, mainly involving improvements to the HBM production process.

Key Differences: Traditional data center HBM is a "high bandwidth, high power consumption" solution designed for high-performance computing, while on-device HBM is a "moderate bandwidth, low power consumption" solution tailored for mobile devices.

Technology Path: Traditional data center HBM relies on TSV and interposers, whereas on-device HBM achieves performance breakthroughs through packaging innovations (e.g., vertical wire bonding) and low-power DRAM technology.

For example, Samsung's LPW DRAM (Low-Power Wide I/O DRAM) uses a similar approach, offering low latency and up to 128GB/s of bandwidth while consuming only 1.2pJ/bit. It is expected to enter mass production during 2025-2026.

LPW DRAM significantly increases the number of I/O interfaces by stacking LPDDR DRAM, achieving the dual goals of higher performance and lower power consumption. Its bandwidth can exceed 200GB/s, 166% higher than LPDDR5X, while its power consumption drops to 1.9pJ/bit, 54% lower than LPDDR5X.
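
The energy-per-bit figures can be turned into power draw at full bandwidth (power = bit rate × energy per bit); the sketch below uses only the numbers quoted above, back-deriving the LPDDR5X energy from the stated 54% reduction.

```python
# Power at full bandwidth = (bytes/s x 8 bits) x (energy per bit).
def dram_power_w(bandwidth_gb_s: float, pj_per_bit: float) -> float:
    return bandwidth_gb_s * 1e9 * 8 * pj_per_bit * 1e-12

lpw_pj = 1.9                      # LPW DRAM, as stated above
lpddr5x_pj = lpw_pj / (1 - 0.54)  # back-derived from the "54% lower" claim

print(dram_power_w(200, lpw_pj))       # LPW DRAM at 200 GB/s: ~3.0 W
print(dram_power_w(200, lpddr5x_pj))   # LPDDR5X at the same rate: ~6.6 W
```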

UFS 3.1 has already been widely adopted in vehicles and will gradually iterate to UFS 4.0 and UFS 5.0, while PCIe SSD will become the preferred choice for L3/L4 high-level autonomous vehicles.

At present, high-level autonomous vehicles generally adopt UFS 3.1 storage. As vehicle sensors and computing power advance, higher-specification data transmission solutions become imperative, and UFS 4.0 products will become one of the mainstream options. UFS 3.1 offers a maximum speed of 2.9GB/s, far below PCIe SSD speeds. The next-generation UFS 4.0 will reach 4.2GB/s, providing higher speed while cutting power consumption by 30% compared with UFS 3.1. By 2027, UFS 5.0 is expected to arrive with speeds of around 10GB/s, still well below SSD, but with the advantages of controllable cost and a stable supply chain.
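
To make the gap concrete, the sketch below computes how long each interface would take to move a fixed payload at its peak rate; the 1TB payload and the eMMC-class rate are illustrative assumptions, and the PCIe figure is the read speed quoted further below.

```python
# Time to move a fixed payload at each interface's peak sequential rate.
# The 1 TB payload and ~0.4 GB/s eMMC-class rate are assumptions; other rates
# are the ones quoted in this summary.
payload_gb = 1000   # e.g. one terabyte of logged sensor data (assumed)

rates_gb_s = {
    "eMMC-class":      0.4,
    "UFS 3.1":         2.9,
    "UFS 4.0":         4.2,
    "UFS 5.0 (est.)": 10.0,
    "PCIe 5.0 x4 SSD": 14.5,
}
for name, rate in rates_gb_s.items():
    print(f"{name:>15}: {payload_gb / rate / 60:5.1f} min")
```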

Given the strong demand for foundation models from both the cockpit and autonomous driving, and to ensure sufficient performance headroom, SSD should be adopted instead of the current mainstream UFS (which is not fast enough) or eMMC (which is even slower). Automotive SSDs use the PCIe interface, which offers tremendous flexibility and headroom. JEDEC's JESD312 automotive SSD standard builds on PCIe 4.0, which supports multiple lane counts and rates: a 4-lane link is the baseline configuration, and a 16-lane full-duplex link can reach 64GB/s. PCIe 5.0, released in 2019, doubles the signaling rate to 32GT/s, with x16 full-duplex bandwidth approaching 128GB/s.
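
The lane-count figures follow from standard PCIe arithmetic: per-lane signaling rate × 128b/130b encoding efficiency, summed over lanes and doubled for full duplex. A minimal sketch:

```python
# PCIe throughput: signaling rate (GT/s) x 128b/130b encoding / 8 bits per
# byte, per lane and per direction; "duplex" counts both directions.
def pcie_gb_s(gt_per_s: float, lanes: int, duplex: bool = False) -> float:
    per_lane = gt_per_s * (128 / 130) / 8
    return per_lane * lanes * (2 if duplex else 1)

print(pcie_gb_s(16, 16, duplex=True))   # PCIe 4.0 x16 duplex: ~63 GB/s
print(pcie_gb_s(32, 16, duplex=True))   # PCIe 5.0 x16 duplex: ~126 GB/s
print(pcie_gb_s(32, 4))                 # PCIe 5.0 x4, one direction: ~15.8 GB/s
```

The last figure is also why the 14.5GB/s read speed cited below sits just under the ceiling of a PCIe 5.0 x4 link.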

Currently, both Micron and Samsung offer automotive-grade SSDs. The Samsung AM9C1 series ranges from 128GB to 1TB, while the Micron 4150AT series comes in 220GB, 440GB, 900GB, and 1800GB capacities. The 220GB version is suitable for a standalone cockpit or intelligent driving domain, while cockpit-driving integration requires at least 440GB.

A multi-port BGA SSD can serve as a centralized storage unit in the vehicle, connecting via multiple ports to the SoCs for the cockpit, ADAS, gateway, and more, and efficiently processing and storing different types of data in dedicated areas. This isolation ensures that non-core SoCs cannot access critical data without authorization, preventing interference with, misidentification of, or corruption of core SoC data. It maximizes the isolation and independence of data transmission while also reducing the per-SoC storage hardware cost.

For future L3/L4 high-level autonomous vehicles, PCIe 5.0 x4 + NVMe 2.0 will be the preferred choice for high-performance storage:

  • Ultra-high-speed transmission: read speeds up to 14.5GB/s and write speeds up to 13.6GB/s, roughly three times faster than UFS 4.0.

  • Low latency and high concurrency: support for higher queue depths (QD32+) to process multiple data streams in parallel.

  • AI computing optimization: works with vehicle SoCs to accelerate AI inference and meet the requirements of fully autonomous driving.

In autonomous driving applications, PCIe NVMe SSD can cache AI computing data, reducing memory access pressure and improving real-time processing capabilities. For example, Tesla’s FSD system uses a high-speed NVMe solution to store autonomous driving training data to enhance perception and decision-making efficiency.

Synopsys has already launched the world’s first automotive-grade PCIe 5.0 IP solution, which includes PCIe controller, security module, physical layer device (PHY), and verification IP, and complies with ISO 26262 and ISO/SAE 21434 standards. This means PCIe 5.0 will soon be available for automotive applications.

Table of Contents

1 Overview of Automotive Memory Chip Industry
  • Classification of Automotive Memory Chips
  • Three Major Categories of Memory Devices
  • Classification of Memory Chips (Semiconductor Memory)
  • Classification of Automotive Memory Chips
  • Demand Characteristics of Automotive Memory Chips
  • Global Memory Chip Market and Development Prospects for Automotive Memory
  • Global Memory Chip Market Size
  • Changes in Global Memory Chip Market: AI Drives both Memory Capacity and Performance
  • Changes in Global Memory Chip Market: Application Scale of DRAM by Segment
  • Changes in Global Memory Chip Market: DRAM Evolves Towards Higher Bandwidth and Larger Capacity
  • Changes in Global Memory Chip Market: Significant Expansion of HBM Market
  • Changes in Global Memory Chip Market: Continuous Upgrade of HBM Technical Specifications
  • Changes in Global Memory Chip Market: NAND
  • Application Trends of Automotive Memory Chips: Huge Room for Increment in Major Application Segments
  • Application Trends of Automotive Memory Chips: Forecast of Automotive Memory Chip Output Value in 2030
  • Application Trends of Automotive Memory Chips: Capacity of DRAM and NAND Memory in Various Types of Vehicles Will Double
2 Development Trends of Automotive Memory Chips in Various Application Scenarios
  • Memory Demand Under the Evolution Trend of Autonomous Driving
  • Installation Rate of Autonomous Driving (by Level) in China’s Local Passenger Cars, 2024-2030E
  • AI Empowers the Automotive Sector, and Increases Memory Demand from Intelligent Driving
  • Development Trends of Autonomous Driving Systems: Evolution of System Latency and Chip Applications
  • Development Trends of Autonomous Driving Systems: Memory Chip Design Based on NVIDIA Thor
  • Development Trends of Autonomous Driving Systems: Requirements of High-Level Autonomous Driving for Bandwidth and Capacity of Automotive Memory Chips
  • Requirements of Autonomous Driving Systems for NAND Memory
  • Autonomous Driving Systems Further Introduce Advanced NAND Memory Technology
  • Development Trends of Autonomous Driving Algorithms
  • Challenges to Automotive Memory in the Era of Foundation Models: Computing Chips Should Not Overemphasize Computing Power but Memory Bandwidth
  • Analysis of Automotive Foundation Model Computing: Calculation Relationship between DRAM Bandwidth and Time per Token
  • Analysis of Automotive Foundation Model Computing: Calculation Relationship between DRAM Bandwidth and Time per Token (2): Calculation Steps
  • Mass-Produced and Deployed Autonomous Driving SoCs: Autonomous Driving SoC Platforms Installed in China’s Local Passenger Cars, 2022-2024
  • L2.5 Highway NOA Computing Platform and Memory Chip Dismantling 1
  • L2.5 Highway NOA Computing Platform and Memory Chip Dismantling 2
  • L2.5 Highway NOA Computing Platform and Memory Chip Dismantling 3
  • L2.9 Urban NOA Computing Platform and Memory Chip Dismantling 4
  • L2.9 Urban NOA Computing Platform and Memory Chip Dismantling 5
  • L2.9 Urban NOA Computing Platform and Memory Chip Dismantling 6
  • Memory Demand Under the Trend of Edge AI Deployment in Cockpit
  • Edge AI Deployment in Vehicles: System Framework
  • Edge AI Deployment in Vehicles: Outlook for AI Application in Vehicle Intelligence
  • Edge AI Deployment in Vehicles: Platform Capabilities Required for AI Deployment
  • OEMs Accelerate Edge AI Deployment 1
  • OEMs Accelerate Edge AI Deployment 3
  • OEMs Accelerate Edge AI Deployment 4
  • OEMs Accelerate Edge AI Deployment 5
  • Edge AI Deployment in Cockpit
  • Mass-Produced and Deployed Cockpit SoCs: Installations of Cockpit SoC Platforms in China’s Local Passenger Cars, 2022-2024
  • Mass-Produced and Deployed Cockpit SoCs: Performance Parameters and Supported DRAM Bandwidth
  • Memory Demand of Central Supercomputing Under EEA Evolution
  • Status Quo of EEA Deployment, and Five-Year Trend Forecast
  • Status Quo of EEA Deployment, and Five-Year Trend Forecast (Appendix)
  • Multi-Domain DCU - Typical Multi-Board Solution
  • Central Computing CCU - Typical One-Chip Solution
  • Under the Trend of Centralized EEA, NAND Memory Requirement Will Reach TB Level in 2025
  • Memory Demand Under the Trend of Automotive Data Recording Compliance
  • Policies and Standards Concerning Automotive Event Data Recorder (EDR)
  • Automotive Event Data Recorder (EDR) System Design
  • Automotive Event Data Recorder (EDR) Generates GB-Level Memory Demand
  • Core Memory Demand of Automotive Event Data Recorder (EDR)
  • New Memory for Automotive Event Data Recorder (EDR)
  • Summary of Automotive Memory Application Trends
  • Memory Demand of Submodules of Intelligent Vehicles
  • Sources of In-Vehicle Data
  • Application and Challenges of Automotive High-Performance NAND Memory Technology in Intelligent Cockpit and Autonomous Driving
  • Directions of Automotive High-Performance NAND Memory Technology Change
  • Application of High-Performance NAND Memory Products in Automotive Market
  • Analysis of Automotive High-Performance Memory Demand
  • Status Quo and Trends of Automotive Memory Application by OEMs
3 Production, Testing, Certification, and Localization Progress of Automotive Memory Chips
  • Classification of Automotive Memory Chip Vendors
  • Automotive Memory Chip Industry Chain
  • Automotive Memory Chip Industry Chain and Market Pattern
  • Manufacturing Process of Automotive Memory Chips
  • Packaging and Testing Process of Automotive Memory Chips
  • Manufacturing and Packaging & Testing of Automotive Memory Chips
  • Evolution Trends of Chip Packaging Technology
  • Major Advanced Packaging Platforms Worldwide
  • Technology Deployment of Global Advanced Packaging Companies
  • Top 10 Global Outsourced Semiconductor Assembly and Test (OSAT) Rankings
  • Advanced Packaging Companies for Automotive Memory Chips
  • Advantages of System-in-Package (SiP) in Automotive Application
  • Key Technologies and Implementation Modes of System-in-Package (SiP)
  • Application Characteristics of System-in-Package (SiP) in New Energy Vehicles
  • Automotive Chip Packaging Process: Chiplet Offers Flexibility and IP Reusability
  • Capacity Layout of Automotive Memory Chip Wafer Manufacturers
  • Business Models of Automotive Memory Chip Wafer Manufacturers
  • DRAM Products and Technology Trends of Major Wafer Manufacturers
  • Capacity Layout of Automotive Memory Chip Wafer Manufacturers
  • Automotive Memory Chip Certification Standard System
  • Supply Chain Entry and Certification Process for Automotive Memory Chips
  • Automotive Supply Chain Certification Standard System and Specifications Required for Automotive Chips
  • Certification Standard System for Automotive Chips: ISO 26262 Functional Safety Level Certification
  • Safety Requirements for Embedded Flash Memory Under ISO 26262 Functional Safety Standards: Comparison between Embedded Flash Memory and Off-chip Flash Memory
  • Automotive Chip Certification System: IATF 16949 Quality Management System Certification
  • Summary of Certifications for Automotive Memory Chips
  • Localization Level and Progress of Automotive Memory Chips
  • Status Quo of Competition among Chinese Memory Chip Vendors
  • Four Types of Chinese Memory Chip Vendors
  • Summary of Revenues, Gross Margins, and Businesses of Chinese Memory Chip Vendors
  • Chinese Vendors’ Automotive DRAM Products (DDR)
  • Overseas Vendors’ Automotive DRAM Products (LPDDR)
  • Chinese Vendors’ Automotive DRAM Products (LPDDR)
  • Overseas Vendors’ Automotive DRAM Products (GDDR)
  • Overseas Vendors’ Automotive DRAM Products (HBM)
  • Overseas Vendors’ Automotive eMMC Products
  • Chinese Vendors’ Automotive UFS Products
  • Chinese Vendors’ Automotive NAND Flash Products
  • Overseas Vendors’ Automotive UFS Products
  • Overseas and Chinese Vendors’ Automotive SSD Products
4 Technology Trends of Automotive Memory Chips by Product Segment
  • Application Trends of Automotive DRAM: LPDDR5X
  • In the Next Stage, Automotive DRAM Will Focus on Promoting LPDDR5X
  • Typical Cases of Application of SoC Platform in LPDDR5X
  • Next-Generation LPDDR6
  • Application Trends of Automotive DRAM: GDDR6/GDDR7
  • GDDR6's High Energy Consumption and Cost Make It Less Preferred by OEMs
  • GDDR7 Standard of JEDEC Solid State Technology Association
  • GDDR7 May Become the Mainstream Choice for Next-Generation Computing Platforms of OEMs
  • Application Trends of Automotive DRAM: High-Bandwidth Memory (HBM)
  • HBM: Production Process and Cost
  • HBM: DRAM and GPU Packaging, for AI Applications
  • HBM Is Primarily Used in High-Performance AI Computing Servers
  • Role of HBM in Transformer AI Models
  • Global Usage of HBM in Major AI Chips
  • Competitive Pattern of Global HBM Vendors
  • Performance Evolution and Development History of HBM
  • Discussion 1 on HBM Application in Automotive
  • Application Trends of Automotive Flash Memory: UFS3.1/UFS4.0
  • Automotive UFS 4.0
  • UFS 4.0 Case
  • UFS 4.1 Case
  • Application Trends of Automotive Flash Memory: PCIe Solid-State Drive (SSD)
  • Purposes of PCIe
  • PCIe Standard Specifications
  • PCIe System Architecture
  • High-Bandwidth, Low-Latency PCIe Bus Is A Key Future Direction for Automotive Memory
  • PCIe-based CXL Memory Technology Will Be Promoted in the Automotive Industry
  • Evolution of Automotive EEA Drives Demand for PCIe SSD Memory
  • Multi-Port BGA PCIe SSD Solution for Central Computers
  • Deployment 1 of PCIe SSD in Vehicles
  • Deployment 2 of PCIe SSD in Vehicles
  • Deployment 3 of PCIe SSD in Vehicles
  • Automotive Memory Trends: In-Memory Computing
  • Conceptual Diagram of In-Memory Computing Technology
  • Technical Solutions of Generalized In-Memory Computing
  • PIM (Processing-in-Memory) Is A Hot Spot in Next-Phase Development
  • True CIM (Computing-in-Memory)
  • CIM Mainly Faces the Memory Medium Technology Path Selection
  • Chinese In-Memory Computing Chip Companies and Their Technology Path Selection
  • Significance of "In-Memory Computing" to Autonomous Driving
  • Memory Solutions for "In-Memory Computing"
5 Automotive Memory Chip Wafer Manufacturers
  • CXMT
  • Business Overview and Capacity Layout
  • DRAM Technology Roadmap
  • Automotive LPDDR4X DRAM
  • Automotive LPDDR5 DRAM
  • DDR4 DRAM
  • DRAM Modules
  • Analysis of G4 DDR5 and Comparison between Domestic and Overseas Products
  • YMTC
  • Business Overview and Capacity Layout
  • 3D NAND Technology
  • 232-Layer QLC 3D NAND
  • 3D NAND Technology Iteration
  • Launch of Memory Products Based on Xtacking® 4.0 Architecture
  • Automotive NAND Flash
  • UFS 3.1
  • Layout in Automotive Electronics Sector
  • Subsidiary: Yangtze Mason Semiconductor
  • Yangtze Mason Semiconductor’s eMMC Products
  • Yangtze Mason Semiconductor’s Self-Built Packaging Plant
  • Yangtze Mason Semiconductor’s Automotive SSD
  • Yangtze Mason Semiconductor’s Industrial/Automotive SSD
  • Yangtze Mason Semiconductor’s Automotive eMMC 5.1
  • Yangtze Mason Semiconductor’s Automotive LPDDR4X
  • Subsidiary: Wuhan Xinxin Semiconductor Manufacturing Co., Ltd.
  • Samsung
  • Operation, 2024
  • Operation of Memory Business, 2024
  • Automotive Memory Chip Capacity Layout
  • DRAM and NAND Roadmap
  • Automotive Memory Product Line
  • eUFS Evolution Planning
  • Automotive UFS 3.1 Memory Solutions in Mass Production
  • PCIe SSD Evolution Plan
  • PCIe 5.0 SSD Product Portfolio
  • PCIe 5.0 SSD
  • Automotive SSD - AutoSSD
  • GDDR Memory Evolution Plan
  • Launch of 24Gb GDDR7 DRAM for Next-Gen AI Computing
  • LPDDR Memory Chip Evolution Plan
  • 16GB LPDDR5X + 1TB UFS 3.1 Multi-Chip Packaging Technology
  • Automotive LPDDR5X
  • DDR Memory Chip Evolution Plan
  • Samsung Is Developing DDR6 DRAM Using MSAP Technology
  • HBM Chip Evolution Plan
  • Architecture of PIM-enabled High Bandwidth Memory (HBM-PIM)
  • Strategic Partnership with SemiDrive
  • Samsung and STMicroelectronics Launched Embedded Phase-Change Memory (ePCM) Technology
  • Samsung Has Used YMTC’s Patented Technology Starting from V10
  • SK Hynix
  • Automotive Memory Chip Capacity Layout
  • Automotive Memory Product Line
  • LPDDR Memory Chip Evolution Plan
  • Automotive LPDDR5
  • HBM Chip Evolution Plan
  • Launch of HBM3E
  • Automotive eMMC 5.1
  • UFS Memory Chip Evolution Plan
  • UFS 3.1 Memory Chip
  • GDDR Memory Evolution Plan
  • AiM (Accelerator-in-Memory) Architecture
  • GDDR7 Memory
  • World’s First 12-Layer HBM4 Samples Shipped, Mass Produced in H2
  • Application Scenarios of Automotive Memory Chips
  • Micron
  • Operation of Memory Business, FY2025Q1
  • Automotive Memory Chip Capacity Layout
  • NAND and SSD Product Lines
  • G9 NAND Technology
  • QLC NAND Technology
  • 4150AT Automotive SSD
  • LPDDR Memory Chip Evolution Plan
  • Automotive LPDDR5X
  • Automotive UFS 3.1
  • Automotive e.MMC 5.1
  • Automotive NOR Flash Products
  • Application Scenarios of Automotive Memory Chips
  • Delivery of 6th-Gen DDR5 Samples (1γ DRAM Node)
  • KIOXIA (Toshiba)
  • Successfully Listed in 2024
  • Operation of Memory Business, FY2024
  • Automotive Memory Chip Capacity Layout
  • BiCS FLASH 3D NAND Technology
  • Automotive Memory Products
  • Application Scenarios of Automotive Memory Products
  • Automotive UFS 3.1 / 2.1 & e-MMC
  • Western Digital
  • Operation, FY2025Q2
  • Automotive Memory Product Line
  • UFS Memory Chip Evolution Plan
  • UFS 3.1
  • UFS
  • eMMC Embedded Flash Evolution Plan
  • IX SN530 Industrial-Grade SSD
  • Silicon Motion
  • Operation, 2024
  • Automotive Memory Solutions
  • UFS Memory Chip Evolution Plan
  • PCIe SSD Evolution Plan
  • Automotive PCIe NVMe SSD Controller
  • Comprehensive Automotive Memory Solutions
  • Automotive Single-Chip Memory Solutions
  • Ferri Series Automotive Memory Solutions
  • Fujitsu
  • Rebranded as RAMXEED, Focusing on High-Performance Memory Business
  • Operation, FY2024Q3
  • Automotive Memory Chip Capacity Layout
  • Building Three Product Lineups: FRAM, ReRAM, and NRAM
  • Summary of FRAM’s Technical Advantages
  • Parameter Comparison between EEPROM, NOR Flash, FRAM, NRAM, and ReRAM
  • Wide Application Fields of FRAM
  • FRAM Applications in Automotive Electronics
  • FRAM Chip Evolution Plan
  • Technical Features of Automotive FRAM Products
  • 4Mbit FRAM Memory, High-Capacity FRAM Empowering Future Vehicles
  • Launch of New NRAM, Combining Advantages of FRAM and NOR Flash
  • New Non-Volatile ReRAM (Resistive RAM)
  • Next-Generation Products
  • Future Plan
  • Neo Semiconductor
  • 3D DRAM Technology
  • X-NAND (3D NAND Flash)
  • 3D X-DRAM
  • 3D X-AI (3D Memory with AI Execution)
  • Nanya Technology
  • Business Scope and Capacity Layout
  • Automotive DDR2 Series
  • Automotive DDR3 1Gb Series
  • Automotive DDR3 2Gb Series
  • Automotive DDR3 4Gb/8Gb Series
  • Automotive LPDDR2 Series
  • Automotive LPDDR4 Series
  • Automotive LPDDR4X Series
  • Automotive MCP Series
6 Automotive Memory Chip Product Manufacturers
  • GigaDevice
  • Revenue, Q1-Q3 2024
  • NOR Flash Product Series
  • Automotive SPI NAND Flash
  • Automotive Parallel NAND Flash
  • DRAM DDR4 Products
  • Ingenic
  • Business Overview
  • Overview of Business Segments
  • Revenue, Q1-Q3 2024
  • LPDDR Memory Chip Evolution Plan
  • DDR Memory Chip Evolution Plan
  • Automotive SRAM Products
  • eMMC Products
  • Xi'an UniIC
  • Business Overview
  • Main Expansion Controller Technology Solution for CXL Memory
  • Embedded DRAM Technology (SeDRAM®) Solution
  • DRAM KGD Solution
  • New-generation DRAM KGD Product Series
  • Automotive Memory Chip Solutions
  • Automotive Ultra-low-power LPDDR4X Memory
  • Automotive DDR3
  • Montage Technology
  • Revenue, 2024
  • CXL Technology Ecosystem
  • Automotive DDR4 RCD Chip (Core Memory Product)
  • DDR5 Memory Interface Chip (RCD/MDB)
  • DDR5 Memory Interface Chip Has Development Potential
  • Longsys
  • Business Overview
  • Revenue, 2024
  • Product Lineup
  • Comprehensive Quality Management to Create High-Quality Automotive Memory
  • Automotive Memory Chip Product Lineup
  • Automotive SPI NAND Flash
  • Self-development of Small- and Medium-capacity Memory Chips
  • UFS Evolution Plan
  • FORESEE Automotive UFS
  • eMMC Evolution Plan
  • Self-development of Memory Main Control Chips and SLC NAND Flash Chips
  • FORESEE Vehicle Monitoring SSD (PCIe 4.0/5.0)
  • FORESEE Automotive LPDDR4X (DRAM cache)
  • Lexar® JumpDrive® Dashcam USB Drive
  • Latest Products
  • XMC
  • Business Overview
  • NOR Flash Foundry Business
  • NOR Flash Products
  • Automotive SPI NOR Flash
  • Giantec Semiconductor
  • Business Overview
  • Revenue, 2024
  • Core Technologies
  • Automotive EEPROM Product Series
  • Automotive SPI NOR Flash Memory Chips
  • Automotive EEPROM Products
  • Automotive EEPROM Applications
  • SPD Product Series
  • NOR Flash
  • Pramor Semiconductor
  • Business Overview
  • Revenue, 2024
  • Automotive NOR Flash Product Line
  • Automotive NOR Flash
  • Automotive EEPROM Product Line
  • Automotive EEPROM
  • Fudan Microelectronics
  • Business Overview
  • Revenue, 2024
  • Memory Product Line
  • Automotive FM Series EEPROM Product Roadmap
  • FM Series EEPROM Products (I2C Automotive)
  • FM Series EEPROM Products (SPI Automotive)
  • Automotive EEPROM Chip FM24C512DA1 Passed Automotive Certification
  • Automotive NOR Flash Product Line
  • Automotive NAND Flash Product Line
  • Automotive NAND Flash
  • Macronix
  • Business Overview
  • Automotive NOR Flash Product Line
  • Automotive NOR Flash MX25L12833F Compatible with NVIDIA Thor
  • Automotive NAND Product Line
  • Armor Flash Memory Applied in NVIDIA DRIVE AGX Xavier and Pegasus Platform
  • Automotive e.MMC Memory Chips
  • Biwin Storage
  • Business Overview
  • Automotive Memory Product Line
  • Automotive LPDDR Evolution Plan
  • uMCP Chips
  • Automotive Memory SATA C

Companies Mentioned

  • CXMT
  • YMTC
  • Samsung
  • SK Hynix
  • Micron
  • KIOXIA (Toshiba)
  • Western Digital
  • Silicon Motion
  • Fujitsu
  • Neo Semiconductor
  • Nanya Technology
  • GigaDevice
  • Ingenic
  • Xi'an UniIC
  • Montage Technology
  • Longsys
  • XMC
  • Giantec Semiconductor
  • Pramor Semiconductor
  • Fudan Microelectronics
  • Macronix
  • Biwin Storage
  • Winbond
  • SanDisk
  • YEESTOR
  • Dosilicon
  • KXW
  • JingCun Technology
  • Phison
  • Belling
  • Gencun Technology

Methodology
