Information

Vol. 24, No. 4, pp. 68–85, Apr. 2026. https://doi.org/10.53829/ntr202604in1

Report on “NTT R&D FORUM 2025—IOWN∴Quantum Leap”

NTT R&D Forum Secretariat

Abstract

“NTT R&D FORUM 2025—IOWN∴Quantum Leap” was held over five days, from November 19–21 and 25–26, 2025. This report introduces key points from keynote speeches, technical seminars, and technology exhibits at the forum.

Keywords: R&D Forum, IOWN, optical quantum computer

1. Overview

Marking the 100th anniversary of the birth of quantum mechanics, 2025 has been designated by the Japanese government as the “first year of quantum industrialization.” At NTT, President and Chief Executive Officer (CEO) Akira Shimada and Senior Vice President, Head of Research and Development Planning Shingo Kinoshita heralded “Quantum Leap” as a key message in their keynote speeches.

With an eye on the imminent turning point of the times, NTT R&D Forum 2025 covered a wide range of research and technologies that will be essential to society. NTT’s optical network, the Innovative Optical and Wireless Network (IOWN), was upgraded from version 1.0 to 2.0 in 2025 and will progress to versions 3.0 and 4.0 from 2028 onwards. The Forum demonstrated how NTT is taking optical computing technology to new heights. It also demonstrated that, by combining reliable and proven optical communication technology with quantum technology, NTT will open new horizons by developing and implementing an “optical quantum computer” with overwhelming scalability of 1 million quantum bits (qubits). The technical seminars held during the Forum focused on optical quantum technology and discussed various research and technologies derived from it. The technology exhibition featured 89 cutting-edge research projects and achievements, including NTT’s large language model (LLM) tsuzumi 2, as well as topics such as sustainability (represented by optical fiber sensing), mobility (represented by autonomous and remote driving), digital twins, security, energy, and technologies using space and artificial satellites.

2. Keynote speeches

2.1 Keynote Speech 1: Innovation in Computing Powered by Photonic Technology—Evolution toward IOWN 2.0 and 3.0, and the Leap to Quantum

At the R&D Forum 2025, Akira Shimada, NTT President and CEO, presented two innovations that will enable photonic technology to break the limits of computing power and energy efficiency (Photo 1).


Photo 1. Akira Shimada.

The first innovation he talked about is photonic computing using IOWN. Replacing electrical wiring with optical wiring dramatically reduces the power consumption and heat generated by large-capacity, low-latency communications between graphics processing units (GPUs) and other devices. He announced that NTT has gradually introduced new technologies, such as a photonics-electronics convergence (PEC) device called PEC-1, an optical engine called PEC-2, and a PEC switch incorporating PEC-2 devices, all of which handle conversion between optical and electrical signals. He also announced that IOWN 2.0 will feature board-to-board optical interconnects and that NTT has developed a high-performance switch with a capacity of 102.4 Tbit/s. This prototype PEC switch consumes approximately one-eighth the power of conventional switches. As stated in NTT’s roadmap, IOWN 3.0 aims to achieve optical input/output (I/O) between packages and miniaturize optical devices by using the membrane (thin-film) device architecture mounted on an optical chiplet called PEC-3 (scheduled for commercial use in 2028). IOWN 4.0, targeted for around 2032, aims to introduce optical interconnects inside the package, reducing power consumption by 99%. He also mentioned that NTT is simultaneously working on improving production lines and linking the supply chain for PEC switches.

The other innovation he talked about is optical quantum computers. NTT is focusing on an optical approach to quantum computing, which operates at room temperature and atmospheric pressure. To achieve overwhelming scalability through the high speed and low power consumption of light and strong affinity with optical communications, NTT is increasing qubit yields by improving the quality of quantum light sources. In collaboration with OptQC and RIKEN, NTT aims to build an optical quantum computer capable of general-purpose, large-scale calculations by 2027 and achieve a world-leading 1 million qubits by 2030. The company also aims to reach 100 million qubits in the future. He also stated that this computing capability will make it possible to solve previously difficult social challenges, such as drug discovery, transportation optimization, and fusion reactor design.

President Shimada also overviewed other quantum computing systems and the challenges they face. Having a high level of gate maturity, the superconducting system is currently the most advanced in terms of actual equipment; however, he pointed out that it requires large cooling equipment to operate at extremely low temperatures, consumes a large amount of power, and incurs high installation costs, all of which make it difficult to scale up. A neutral-atom system is also expected to be scalable, but the complexity and stability of the laser and optical systems present challenges. All other systems share the disadvantage of being prone to bottlenecks due to low temperatures, special environments, and control-circuit overheads. Compared with these systems, the optical quantum computer being developed by NTT takes full advantage of the properties of light, operates at room temperature and atmospheric pressure, and achieves a compact footprint. In other words, it is compatible with NTT’s optical communications technology and does not require cooling infrastructure or large equipment, and is thus advantageous in terms of power consumption and cost. NTT is applying its longstanding expertise in optical communications to quantum light sources and quantum measurement, and by combining this expertise with optical chiplets and optical I/O, it aims to create a highly energy-efficient next-generation infrastructure that combines optical computing and optical quantum computing.

President Shimada concluded his keynote speech with a powerful message: “NTT is using optical technologies to break through the limits of energy consumption and conventional computational processing, driving innovation in computing. The world is currently undergoing unprecedented transformation driven by artificial intelligence (AI), and NTT not only provides the infrastructure to support the AI era but also contributes to a sustainable future through innovation in computing for the forthcoming quantum era.”

2.2 Keynote Speech 2: IOWN∴Quantum Leap

The keynote speech by Shingo Kinoshita, NTT Senior Vice President, Head of Research and Development Planning, was titled “IOWN∴Quantum Leap” and covered topics ranging from the arrival of the AI era to quantum technology. In his speech, he overviewed the outlook for the optical quantum computer currently being researched and developed by NTT, quantum AI, and the research and development (R&D) presented at the R&D Forum 2025 (Photo 2).


Photo 2. Shingo Kinoshita.

He first presented an example of NTT’s exhibit at Expo 2025 Osaka, Kansai, Japan, where three-dimensional (3D) data of performers on stage in a remote location was transmitted in real time to the NTT Pavilion, creating an experience in the pavilion that synchronized 3D images, vibrations, and lighting.

He next explained the meaning of the key message “Quantum Leap,” that is, a leap forward in quantum mechanics and dramatic progress in business. He explained the four approaches that NTT has taken to address the issue of the explosive growth in computing resources and power consumption in the AI era. These approaches include improving the efficiency of the AI execution environment through IOWN and using the generative AI (GenAI) tsuzumi within the scope of classical information processing to make the AI lighter and more efficient. Other approaches expand the scope of information processing from classical to quantum. These include optical quantum computers, which achieve overwhelming improvements in computational performance, and quantum AI, which is a highly efficient AI similar to that of the human brain. He outlined NTT laboratories’ thinking in advancing these efforts by combining classical and quantum approaches and introduced research results on quantum computing, IOWN, and GenAI.

Regarding quantum computing, he stated that NTT is working with OptQC to achieve a scale that far exceeds the current technological level: calculations with 10,000 qubits by 2027 and 1 million qubits by 2030.

In the IOWN 2.0 phase of the IOWN roadmap, PEC-2 devices will be used to bring light close to the application-specific integrated circuit switch, thus shortening the electrical wiring and dramatically reducing power consumption. NTT aims to use this PEC technology on the hardware side and commercialize PEC-2 by the end of fiscal year 2026. On the software side, he introduced an initiative in which the DCI (Data-Centric Infrastructure) controller dynamically optimizes the configuration of distributed GPU clusters to control latency, power consumption, and resource allocation in real time, maximizing the energy efficiency and processing performance of large-scale datacenters.

In the field of GenAI, he emphasized the superiority of tsuzumi 2, a large language model (LLM) developed from scratch by NTT. It features outstanding Japanese-language performance, improved efficiency in developing specialized models, low cost, and high security. Example applications of tsuzumi 2 that he mentioned include natural dialogue using a full-duplex speech-to-speech system, automation of network operations using AI agents, and marketing optimization and mobility prediction using large action models. Regarding research on artificial general intelligence and artificial super intelligence, NTT is focusing on the verbalization of brain activity (Mind Captioning), machine unlearning that enables the deletion of specific knowledge in models, and neuronal analysis that addresses “lies” within LLMs. Finally, he explained the potential of quantum AI and the perspective of exploiting quantum noise and concluded with the words of the first director of the Electrical Communication Laboratory of the Ministry of Communications (the predecessor of NTT laboratories): “Do research by drawing from the fountain of knowledge and provide specific benefits to society through commercial development.”

3. Technical seminars

The two-day technical seminars covered four themes: tsuzumi 2, Physics of Intelligence, Quantum × IOWN (Business), and Optical Quantum Computing (Technology), and involved lively discussions.

3.1 Technical Seminar 1: tsuzumi 2 and the New Landscape of AI Business: Challenges and Prospects of a Japan-originated LLM

The opening remarks were made by Kyosuke Nishida, Senior Distinguished Researcher at NTT Human Informatics Laboratories, Taichi Asami, Senior Research Engineer at NTT Human Informatics Laboratories, and Hiroki Arakawa, Generative AI Task Force Leader at NTT DOCOMO BUSINESS Corporation (Photo 3). They introduced AI businesses and their prospects using tsuzumi 2 as well as various related technologies and specific examples of related businesses.


Photo 3. (From left) Kyosuke Nishida, Taichi Asami, and Hiroki Arakawa.

Mr. Nishida first explained the features of tsuzumi 2, which was launched commercially on October 20, 2025, and its improvements over the first version. The main features of tsuzumi 2 are that it can be run in an on-premises environment and was developed independently by NTT from scratch. With 28.6 billion parameters, it is a dense model designed to be high performance yet easy to run on a single GPU. Featuring a tokenizer optimized for the Japanese language and pre-trained on approximately 10 trillion tokens, tsuzumi 2 has a high level of Japanese comprehension and generation capabilities and can fully grasp and manage data as “sovereign AI.” Its learning process involves pre-training, supervised learning using instructions, and preference-based alignment repeated hundreds of times, which together achieve both task-execution ability and safety. Emphasizing the effectiveness of tsuzumi 2 for multilingualism and business applications, the demonstrations on stage included examples of creating reports in a specified format from patents and papers and correcting emails written by non-Japanese speakers proficient only in hiragana into fluent Japanese accompanied by explanations in English. He expressed his hope of one day creating robots that grow alongside people as “life partners.”

Regarding the area of voice dialogue, Mr. Asami explained the importance of voice interfaces and the difficulty in perfecting them, namely, control of “when to speak” (i.e., turn-taking). The new voice dialogue AI developed by his lab uses high-speed streaming processing in 0.1-second increments to control response timing and interjections naturally, solving the overlapping speech problem that affected the previous model. He explained this feature by using a demonstration of AI-to-human communication through speech. He also emphasized that collaboration with tsuzumi 2 will be strengthened with the goal of enabling smarter conversations, controlling inappropriate remarks, and even controlling speaking style according to tone of voice and time, place, and occasion.
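As an illustration of the turn-taking control described above, the following is a minimal sketch of frame-based response-timing logic; the 0.1-second frame size comes from the talk, while the silence threshold and the is_speech() detector are hypothetical stand-ins rather than NTT’s implementation.

```python
# Frame-based turn-taking: process audio in 0.1-s increments and take the turn
# only after enough silence, avoiding overlap with the user's speech.
FRAME_SEC = 0.1           # frame size mentioned in the talk
SILENCE_TO_RESPOND = 0.5  # illustrative threshold, not a published figure

def is_speech(frame) -> bool:
    """Stand-in voice-activity detector; a real system analyzes the audio."""
    return bool(frame)

def run_dialogue(frames):
    silence = 0.0
    for frame in frames:              # one frame arrives every 0.1 s in a live system
        if is_speech(frame):
            silence = 0.0             # user is talking: hold back
        else:
            silence += FRAME_SEC
            if silence >= SILENCE_TO_RESPOND:
                print("-> AI takes the turn and responds")
                silence = 0.0

run_dialogue([1, 1, 0, 0, 0, 0, 0, 1, 1, 0])  # responds once, after the pause
```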

From the business perspective, Mr. Arakawa spoke about the arrival of AI agents and pointed out the importance of planning capability, functional collaboration that acts as the agent’s hands and feet, the utilization of short-term and long-term memory, and individuality. Specific use cases that he demonstrated include sales support, including real-time summarization, suggestions for next actions, and mock sales negotiations, and practical patent-application support ranging from generating ideas for inventions to writing specifications. The key to deploying AI agents comprises three elements: (i) knowledge specific to the industry and business, (ii) management through collaboration with humans (human in the loop/human on the loop), and (iii) security that guarantees the safety of AI agents (running in on-premises and private-cloud environments). These elements contribute to resolving the “last mile” issues, which include the transfer of tacit knowledge and the handling of domestic data.

This seminar shared a vision in which decentralized and efficient AI groups formed through AI Constellation enable a society in which AI grows and coexists with people, and it concluded that the evolution of voice interfaces is essential for social acceptance.

3.2 Technical Seminar 2: Physics of Intelligence: Exploring the Principles of Emergence of Intelligence

Hidenori Tanaka, Senior Research Scientist at the Physics & Informatics Laboratories of NTT Research, Inc., and Tomoyasu Horikawa, Distinguished Researcher at NTT Communication Science Laboratories, gave their presentations on the topic of “How to facilitate smoother communication between AI and people” (Photo 4).


Photo 4. (From left) Hidenori Tanaka and Tomoyasu Horikawa.

Mr. Tanaka, a Silicon Valley-based researcher in the fields of neuroscience, mind, and intelligence, first positioned AI as an invention that will bring about a paradigm shift comparable to the steam engine of the First Industrial Revolution. Through highly original considerations, such as what “intelligence in AI” means and whether or not AI is creative, he regards AI as a new observation window for understanding the mind from a physical perspective. To study the human mind and intelligence, a psychological approach based on counseling or other means is unavoidable; to study AI, however, a bold physical approach becomes possible. Mr. Tanaka emphasized that everything in AI can be observed and that, as “an intelligent object whose internal structure can be observed,” it offers a good opportunity to explore the principles of the mind and intelligence mathematically. Current AI, the so-called neural network, is a collection of artificial neurons that mimics the human brain; essentially, it is deep learning. The underlying mechanisms of AI are not well understood, but we know, in practice, that as the scale of AI (model size, amount of data, number of calculations, etc.) increases, it suddenly acquires conceptual understanding. Giving concrete examples, he showed how AI can gradually learn concepts such as color, shape, and relationships, and he presented research results showing how AI can derive never-seen combinations (such as a woman wearing a hat) by manipulating its internal vector space using concept-direction methods. He thus showed that AI has hidden capabilities and that, if properly guided, it can produce creative output.
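The concept-direction idea can be illustrated with simple vector arithmetic. The sketch below uses random vectors as stand-ins for a trained model’s embeddings, so it only shows the mechanics: estimate a direction for a concept from an example pair, then add it to another representation to form a never-seen combination.

```python
# Toy "concept direction" demo with stand-in embeddings (random vectors).
import numpy as np

rng = np.random.default_rng(0)
dim = 64
emb = {w: rng.normal(size=dim) for w in ["woman", "man", "man_with_hat"]}

# Estimate the direction encoding "wearing a hat" from one example pair.
hat_direction = emb["man_with_hat"] - emb["man"]

# Apply it to "woman" to synthesize the never-seen combination.
woman_with_hat = emb["woman"] + hat_direction
print(woman_with_hat[:4])  # a new point in the embedding space
```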

He also emphasized the importance of the relationship between AI and humans. He spoke of the differences in AI-related values between the West and East and explained the need to design not only “alignment” (i.e., adjustment of values) but also a healthy relationship between humans and AI. In that regard, he spoke of attempts to define kindness and long-term effects mathematically as well as the importance of collaboration with psychiatry. Mr. Tanaka’s research results are beginning to influence policy and safety standards at the National Institute of Standards and Technology (NIST) in the USA.

Mr. Horikawa then gave a presentation on “Exploring human thought through brain activity.” He presented his research results on decoding brain-activity data, that is, the process of returning encoded or compressed data to its original readable or usable form so that it can be understood by humans and computers. He introduced “Mind Captioning” technology, which converts human brain activity (measured using functional magnetic resonance imaging) into an “AI expression space” and then converts the content someone is seeing or imagining into text. This technology uses captions attached to videos to learn a brain-to-machine representation mapping, then uses masked language models (guessing hidden words in a sentence) and iterative optimization (repeated trial and error to derive a better answer) to find the explanatory text that most closely matches brain activity. This process goes beyond conventional limited category recognition; it improves the accuracy of identifying and predicting unlearned categories and makes it possible to extract content during recall. The fact that the accuracy of this technology drops when the word order of generated sentences is shuffled indicates the successful generation of descriptions that accurately capture the structural information related to relationships between visual elements. Since this technology can decipher non-verbal visual thinking without using a language network, it is expected to be applicable not only to AI but also to communication support for people with aphasia and other language-inhibiting disorders as well as to brain-machine interfaces.
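The masked-word search can be pictured as a simple loop: repeatedly mask one position in a candidate sentence, try replacements, and keep whichever makes the sentence’s text features best match the features decoded from brain activity. Everything below (the hashing-trick encoder, the tiny vocabulary, the cosine score) is a stand-in for the masked language model and fMRI decoder used in the actual research.

```python
# Iterative optimization toward features "decoded from the brain" (simulated).
import numpy as np

rng = np.random.default_rng(1)
proj = rng.normal(size=(97, 32))

def text_features(sentence: str) -> np.ndarray:
    """Stand-in text encoder; the real method uses a language model's features."""
    v = np.zeros(97)
    for word in sentence.split():
        v[hash(word) % 97] += 1.0
    return v @ proj

def score(sentence: str, target: np.ndarray) -> float:
    f = text_features(sentence)
    return f @ target / (np.linalg.norm(f) * np.linalg.norm(target) + 1e-9)

brain = text_features("a dog runs on the beach")  # pretend: decoded from fMRI

vocab = ["a", "dog", "cat", "runs", "sleeps", "on", "the", "beach", "sofa"]
words = "a cat sleeps on the sofa".split()
for _ in range(50):                    # trial and error, one word at a time
    i = int(rng.integers(len(words)))  # "mask" one position
    words[i] = max(vocab, key=lambda w: score(" ".join(words[:i] + [w] + words[i + 1:]), brain))
print(" ".join(words))                 # converges toward the target description
```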

3.3 Technical Seminar 3: Quantum × IOWN: Social Change and the Future of AI in Business

This technical seminar was divided into two parts: a technical-side explanation of optical quantum computers by Daisuke Shirai, Senior Research Engineer at NTT Network Innovation Laboratories, and a business-side explanation of optical quantum computers by Takashi Yazane, Manager at the Innovation Technology Department of the Technology and Innovation General Headquarters of NTT DATA Group Corporation (Photo 5).


Photo 5. (From left) Daisuke Shirai and Takashi Yazane.

Mr. Shirai mentioned a November 18, 2025 press release announcing that NTT is collaborating with OptQC to build a general-purpose optical quantum computer with a capacity of 1 million qubits by 2030. He noted that, first, an optical quantum computer has the advantage of being highly scalable because it handles qubits by using optical communication methods and, second, that it consumes less power than other quantum computing systems because it does not require cooling or large-scale control. He explained that this computer reduces quantum noise by using squeezed light generated with an optical parametric amplifier (OPA), has achieved a world-leading squeezing level of 8 dB, and is therefore close to practical use.
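As a rough guide to what the squeezing figure means, the squeezing level in decibels relates the quantum-noise variance V to the vacuum (unsqueezed) level by the standard definition:

\[
S_{\mathrm{dB}} = -10 \log_{10}\frac{V}{V_{\mathrm{vac}}}
\quad\Longrightarrow\quad
\frac{V}{V_{\mathrm{vac}}} = 10^{-S_{\mathrm{dB}}/10} \approx 10^{-0.8} \approx 0.16
\quad (S_{\mathrm{dB}} = 8),
\]

i.e., 8 dB of squeezing suppresses quantum noise to roughly 16% of the vacuum level.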

Quantum computing uses superposition and interference of qubits to efficiently derive solutions from exponential combinatorial spaces (a task that is difficult for classical computers to address). This characteristic makes it an excellent choice for combinatorial optimization and molecular simulation. Optical quantum computers can be scaled up by multiplexing time, wavelength, and space, and NTT aims to achieve terahertz-class high-speed operation using optical clocks and compact rack-scale implementation. However, he pointed out that challenges to be overcome include measures to deal with loss in quantum light, reducing fiber loss in larger-scale applications, building optical circuits for higher speeds, and improving packaging and silicon photonics for implementation. The roadmap toward 1 million qubits calls for beginning to demonstrate use cases from around 2027 and building a general-purpose, large-scale optical quantum computer with error tolerance by 2030. Thinking of the longer term, he outlined a grand vision of linking quantum communications and quantum sensors to build a global quantum computer network.

Mr. Yazane cited technological advances, the investigation of use cases, government investment, and the surge in demand for high-performance computing due to AI and rising data volumes as factors driving expectations for quantum technology, and he summarized these factors from a business perspective. He also explained that since existing semiconductor architectures face performance limitations and power challenges, quantum architectures are a promising solution. Quantum systems come in two types: gate-based (general-purpose) systems and quantum-annealing (specialized) systems known as Ising machines. He emphasized that due to differences in application and maturity, it is important to test the right system for the right application. The diverse range of expected application areas of quantum systems includes chemistry and drug discovery, finance, and AI as well as transportation and logistics, manufacturing optimization, and security (post-quantum cryptography).
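For readers unfamiliar with Ising machines, the problem they target can be stated compactly: find spins s_i in {−1, +1} minimizing the energy E = Σ J_ij s_i s_j. The sketch below brute-forces a three-spin instance with invented couplings; annealers address problem sizes where such enumeration is hopeless.

```python
# Brute-force solution of a tiny Ising problem: minimize sum of J_ij * s_i * s_j.
from itertools import product

J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): -1.0}  # illustrative couplings

def energy(spins):
    return sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())

best = min(product([-1, 1], repeat=3), key=energy)
print(best, energy(best))  # (-1, 1, -1) with energy -3.0
```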

Specific examples of initiatives that he presented include optimization of glass cutting, joint development of an “odor-recreation platform,” and a competition to optimize vehicle-testing processes. These initiatives are focused on short-term and long-term continuity, from short-term performance verification using current quantum machines to business creation with an eye toward future large-scale quantum machines. NTT DATA will provide comprehensive support for such initiatives, including use-case investigation, simulation, verification environments, and datacenter infrastructure, while emphasizing that developing technologies, human resources, and an ecosystem are key tasks.

Overall, “quantum × IOWN,” targeted for societal implementation by 2030, has the potential to create—through low power consumption and high scalability—a competitive advantage in fields such as chemistry, logistics, and finance. However, achieving this target will require overcoming multiple technical challenges, including hardware integration, error correction, network integration, and implementation technology. The core of Mr. Yazane’s presentation was the need to examine use cases and build an ecosystem simultaneously.

He also pointed out that to move towards practical application, preparations concerning social systems, including standardization, development of legal frameworks, data protection and privacy measures, and the transition to post-quantum cryptography, are essential. While companies and research institutes will verify the immediate effect of quantum × IOWN through proofs-of-concept in the short term, in the long term, they will need to continue investing in fundamental technologies, develop human resources, and strengthen industry-academia-government and international collaboration to foster an ecosystem. Advancing standardization and open technology sharing will ensure compatibility and security and enable risk management and business value to be balanced. Accomplishing these tasks is the key to accelerating the societal implementation of quantum × IOWN.

3.4 Technical Seminar 4: Optical Quantum Computing Illuminates the Future—From the Beginning to Cutting-edge Technology

At this technical seminar, Kan Takase, Representative Director and CEO of OptQC, and Takeshi Umeki, Senior Distinguished Researcher at NTT Device Technology Laboratories, spoke about the current state of quantum computers and the challenges facing them as well as new technologies that could be breakthroughs in overcoming these challenges (Photo 6).


Photo 6. (From left) Kan Takase and Takeshi Umeki.

Mr. Takase first explained the background to the founding of OptQC, which originated from the Furusawa Laboratory at the University of Tokyo, and proposed two groundbreaking transformations to solve the serious energy-consumption problems facing current computing infrastructure: first, the transition from conventional (classical) computing to quantum computing and, second, the transition from electrical signals to optical signals.

Quantum gates are created by “quantum teleportation” (i.e., measurement-induced quantum computing). This process requires a large amount of quantum resources in an entangled state, but the physical implementation can be made scalable by multiplexing in the time domain. It thus becomes possible to reuse high-speed cores and multiplex the number of qubit inputs over time, keeping the hardware from growing in scale. As an example of this constant hardware scale, he introduced an optical-gate system (MQC3) installed at RIKEN that can accommodate 100 input quantum signals and connect to the cloud, and he demonstrated that the size of systems created in the past is almost the same as that of the latest model currently being created. OptQC is currently developing a modularized 100-qubit unit (due for completion in 2026). It has announced plans for a second unit, expected to be completed around 2027, that uses OPAs and other devices to aim for 100 times the current clock and input count (10,000 qubits). He said that his goal is to create an “all-optical quantum computer” in which all systems that are currently based on electricity, such as measurement and control, are replaced with ones controlled by light. He also said that he has high hopes for future technologies that will emerge from the application of integrated photonics such as silicon photonics and thin-film lithium niobate.
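The hardware economy of time-domain multiplexing can be conveyed with an ordinary-programming analogy (not quantum code): one fixed processing core is reused across time slots, so the number of logical inputs grows with running time rather than with hardware.

```python
# Analogy for time-domain multiplexing: one fixed "core" reused per time slot.
def core(pulse):
    """Fixed hardware applied identically in every time slot."""
    return pulse * 2  # stand-in for one optical gate operation

def process_stream(pulses):
    # The same single core handles 10 or 1,000,000 time-binned inputs;
    # only the running time grows, not the amount of equipment.
    return [core(p) for p in pulses]

print(process_stream(range(5)))
```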

Mr. Umeki explained the technical details of periodically poled lithium niobate (PPLN) devices and their compatibility with optical communications. Optical quantum technology and digital coherent technology share the fundamental components of a light source, transmission line, and receiver, and both handle information using the phase and amplitude of light (the so-called IQ plane). However, optical quantum technology has quantum-specific requirements such as quantum superposition, quantum entanglement, and loss sensitivity. Mr. Umeki explained that PPLN devices are needed to compensate for and rectify these differences. He reported that PPLN devices have achieved highly efficient interactions that significantly improve amplification gain and conversion efficiency. In particular, the phase-sensitive amplification mode has enabled low-noise amplification and the generation of squeezed light (currently with quantum-noise suppression of over 8 dB).

In regard to quantum technology, he also emphasized the importance of generating non-Gaussian states and pointed out that quantum signals are vulnerable to loss; however, adding loss tolerance by using phase-sensitive amplification makes it possible to use the high-speed detectors and circuits used in communications and thereby measure EPR (Einstein–Podolsky–Rosen) correlation (non-local correlation between particles in a quantum-entangled state) in the 43–60-GHz band. Finally, he concluded that PPLN can be used not only as a quantum light source but also as a pre-amplifier that converts classical light to quantum light and vice versa. In other words, it is a key technology for promoting the integration of optical communications and optical quantum computing.

Referencing the continuous-variable representation of light and time-domain multiplexing, Messrs. Takase and Umeki presented a shared vision of incorporating technologies that NTT developed for optical communications, such as ultra-wideband transmission, coherent detection, and integrated waveguides, into quantum computing. To put scalable, high-clock optical quantum computers into practical use, it is necessary to overcome technical challenges such as improving error correction, dealing with loss, and advancing photonic integration and the fusion of photonics and electronics. They concluded that improving implementability through industry-academia collaboration, standardization, and commercial modularization is also key to addressing these challenges.

4. Technology exhibition

The technology exhibits at NTT R&D Forum 2025 were divided into five sections covering ten themes: “Generative AI,” “IOWN,” “Quantum,” “Sustainability,” “Mobility,” “NW (Network),” “Security,” “Aerospace,” “Digital Twin,” and “UI/UX (User Interface/User Experience).”

As for key exhibits, the Generative AI section showcased tsuzumi 2, the latest lightweight LLM, and natural interactive dialogue technology, and the IOWN section showcased the latest PEC devices and optical computing demonstrated at Expo 2025 Osaka, Kansai, Japan. The Quantum section showcased a wide range of research results—ranging from hardware to software—on world-leading optical quantum computing based on NTT’s longstanding optical communications technology. The Sustainability section showcased algal breeding technology using neutron-beam irradiation and hydrogen-transportation technology. The Mobility section showcased a world model for intelligent transportation, which aims to eliminate traffic accidents. The NW section showcased AI-based autonomous network recovery and the use of satellites and optical fiber to support social infrastructure. The Security section introduced Web3-related technologies and digital trust platforms. The Aerospace section showcased the latest technologies aimed at building the Space Integrated Computing Network. The Digital Twin section showcased affordable, precise large-scale 3D scanning and automatic control technology for robots. The UI/UX section showcased the latest initiatives through hands-on demos, which included vibrotactile transmission via IOWN, and active noise control.

4.1 Generative AI

4.1.1 tsuzumi: evolving large language model

NTT has released a new version of its LLM, tsuzumi 2. While being a lightweight model capable of running on a single GPU, tsuzumi 2 offers world-class Japanese-language performance approaching that of ultra-large models, achieving an excellent balance of operational efficiency and performance. Its primary feature is enhanced capability on tasks that arise frequently in business situations: its ability to handle document-related Q&A tasks, which account for 80% of its usage, as well as document-related information extraction and summarization tasks, has been enhanced.

Since tsuzumi 2 was developed to operate on GPUs with 40 gigabytes of memory or less, it is easy to operate on a closed network within a single company or organization. This feature allows for safe handling of highly confidential information, significantly reducing the risk of leaking trade secrets and ensuring extremely high security. Since it has particularly extensive knowledge in the finance, local-government, and medical fields, it demonstrates excellent performance in many use cases and contributes to goals such as ensuring Japan’s economic security, eliminating the digital deficit, and strengthening the AI industry.
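A back-of-envelope memory estimate suggests why the 40-GB figure matters; the article does not state the numeric precision used, so the bytes-per-parameter values below are assumptions for illustration.

```python
# Rough weight-memory estimate for a 28.6-billion-parameter model.
params = 28.6e9
for bytes_per_param, label in [(2, "16-bit"), (1, "8-bit")]:
    gib = params * bytes_per_param / 2**30
    print(f"{label}: ~{gib:.0f} GiB of weights")
# 16-bit: ~53 GiB (exceeds one 40-GB GPU); 8-bit: ~27 GiB (fits), suggesting
# quantization or similar techniques for single-GPU operation.
```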

At this exhibit, “Orchestrator” was introduced as an example of future use of tsuzumi (Photo 7). As the name suggests, tsuzumi is given the role of an orchestra conductor, and this orchestrator controls business operations. For example, if given a proposition such as “devise countermeasures for the decline in sales at branch A this fiscal year,” it will gather information such as past sales data and the details of complaints received from customers to arrive at an answer. If it cannot gather sufficient information, the orchestrator will, at its own discretion, use the chat function to inquire directly with the person in charge and acquire the missing information.


Photo 7. tsuzumi: evolving large language model.
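The flow described above can be sketched as a simple agent loop. All names and functions here are hypothetical placeholders, not an actual tsuzumi API; the point is only the structure of gathering data, judging sufficiency, and falling back to asking a person via chat.

```python
# Minimal sketch of the "orchestrator" pattern (hypothetical placeholder functions).
def gather(sources, task):
    return {s: f"data about {task} from {s}" for s in sources}

def is_sufficient(info) -> bool:
    return len(info) >= 3  # stand-in for the model's own judgment

def ask_via_chat(question):
    return f"(reply from the person in charge to: {question})"

def orchestrate(task):
    info = gather(["sales_db", "complaint_log"], task)
    if not is_sufficient(info):   # at its own discretion, fill the gap via chat
        info["chat"] = ask_via_chat(f"Missing context for: {task}")
    return f"Answer for '{task}' based on {sorted(info)}"

print(orchestrate("countermeasures for the decline in sales at branch A"))
```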

While other companies have released a variety of AI models, tsuzumi 2 is a purely Japanese model developed by NTT from scratch. Its reliability was ensured throughout the development process, making it truly an AI made for Japanese people.

4.1.2 Fake content prevention technology

To protect the reliability of content (photos, videos, etc.), we have developed countermeasures against fake content (Photo 8). The Coalition for Content Provenance and Authenticity (C2PA) is a standards organization that develops technical specifications for authenticating the origin and provenance of content, and its specifications allow content-provenance information to be easily acquired by anyone using a smartphone or tablet computer. The abbreviation C2PA has thus become a term used synonymously with the provenance and authenticity of content. However, metadata such as the location, date and time, and equipment used can be easily tampered with by using various applications, and anyone can easily modify content by using GenAI. Consequently, C2PA metadata alone no longer guarantees the provenance of data, and it cannot be guaranteed that the content we see every day has not been tampered with.


Photo 8. Fake content prevention technology.

Our fake content prevention technology simplifies fact-checking by verifying authenticity at the time of capture and then signing the data. Consisting of an authenticity check module and an authenticity check tool, it verifies information such as the date and time of capture and location information and determines the authenticity of the content. It also enables users to check the original version of edited content and the original field of view of cropped content. This function makes it easy for individuals to fact-check content that has been pre-signed. One current issue is that this information can only be added to content captured using a dedicated app. Future goals include incorporating this technology into apps that require it and making it a standard feature on devices.
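The sign-at-capture idea can be illustrated with standard public-key signatures: hash the image bytes together with the capture metadata, sign with a key held by the capturing device, and let anyone verify later with the matching public key. This is a generic sketch (using the Python cryptography package), not the actual C2PA manifest format or NTT’s module.

```python
# Illustrative sign-at-capture flow; requires the 'cryptography' package.
import hashlib, json
from cryptography.hazmat.primitives.asymmetric import ed25519

device_key = ed25519.Ed25519PrivateKey.generate()  # would live in secure hardware

image = b"...raw sensor bytes..."
metadata = {"time": "2025-11-19T10:00:00+09:00", "gps": "35.68,139.76"}
payload = hashlib.sha256(image).digest() + json.dumps(metadata).encode()

signature = device_key.sign(payload)   # attached to the content at capture time

# Later, a fact-checker verifies that content and metadata are untampered:
device_key.public_key().verify(signature, payload)  # raises if invalid
print("content and metadata verified")
```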

4.1.3 Superfast software development technology with GAI

Used in the field of new and incremental software development, this technology uses GenAI to achieve ultra-fast, low-cost, and high-quality output, shortening software-development time and reducing operations and processes by 40% (Photo 9). Developing high-quality software that reflects individual user needs currently requires many manual processes. This approach requires each developer to understand the entire project, which has many disadvantages in terms of cost and time. While GenAI has been used in the coding (implementation) process, it has struggled to understand large amounts of context (e.g., situation and line of thought). Software development requires knowledge specific to the software under development; general-purpose GenAI, which has only general knowledge, is unable to perform this advanced task on its own, so human intervention is required.


Photo 9. Superfast software development technology with GAI.

With those issues in mind, we developed a technology that analyzes the diverse data required for software development from multiple angles, understands their dependencies, and builds a knowledge database. GenAI is then tasked with autonomously selecting the data required for each development task from this knowledge database and executing the task appropriately and accurately. The amount of manual work required can thus be minimized, making it possible to develop high-quality, cost-effective software.
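The selection step can be pictured as retrieval over the knowledge database: score each entry’s relevance to the development task and hand the top matches to the model. The bag-of-words scoring and tiny database below are deliberately simple stand-ins for whatever representation the real system uses.

```python
# Stand-in retrieval over a toy knowledge database.
import math
from collections import Counter

kb = {
    "spec_auth": "login flow requires two-factor auth token refresh",
    "spec_billing": "billing job aggregates invoices nightly",
    "dep_graph": "auth module depends on session store and token service",
}

def embed(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm + 1e-9)

def select_for_task(task, k=2):
    q = embed(task)
    return sorted(kb, key=lambda key: cosine(q, embed(kb[key])), reverse=True)[:k]

print(select_for_task("add token refresh to the auth module"))
# The selected entries would be placed in the GenAI prompt for the coding task.
```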

4.1.4 Auto-device-failure diagnosis with GenAI

When a problem such as an inability to connect to the Internet at home occurs, the customer contacts a call center, which, after assessing the situation, contacts a repair center, which then sends a technician to the customer’s home to fix the problem. However, approximately 50% of such inquiries are for minor problems such as a cable coming loose from an optical network unit (ONU). To handle these minor problems without dispatching a technician, this technology enables users to self-check the status of devices such as ONUs by using high-precision AI, i.e., LLMs and vision-language models (VLMs) (Photo 10).


Photo 10. Auto-device-failure diagnosis with GenAI.

However, current LLMs and VLMs are general-purpose AI that do not have highly specialized domain knowledge; accordingly, AI specialized for determining faults in communications devices is required. To satisfy this requirement, NTT has built an AI agent capable of handling multimodal input, and by combining a VLM with technologies such as image processing and peripheral technologies such as a user-friendly UI, we have automated device-failure diagnosis. Support is currently provided by phone or online, but our aim is to offer this technology as a smartphone application.

4.2 IOWN

4.2.1 Dynamic watt-bit collaboration

As a new concept, “watt-bit collaboration” aims to build a sustainable and efficient social foundation by linking electricity (watts) and information and communications (bits) to promote infrastructure development in an integrated manner. NTT is working on “dynamic watt-bit collaboration,” which enables effective use of renewable energy, with the aim of achieving watt-bit collaboration and making datacenters carbon-neutral (Photo 11). For example, dynamic watt-bit collaboration aims to enable workload shifting, which moves the learning and inference workloads of GenAI between multiple datacenters connected by the All-Photonics Network (APN) in accordance with the electricity supply and demand situation, including renewable energy. It also aims to enable charge and discharge control using storage batteries.


Photo 11. Dynamic watt-bit collaboration.

To achieve the above-mentioned aims, it is necessary to use renewable energy more inexpensively and efficiently, so it is essential to implement technology that can accurately predict the ever-changing amounts of power generation and consumption and devise optimal control plans for each type of resource. Considering these issues, NTT is developing technologies for unified management, prediction, optimization, and control that integrate information on power and information-and-communications resources. The prediction accounts for geographical and temporal characteristics to comprehensively and accurately estimate renewable-energy generation as well as electricity prices and trading volumes. In the optimization process, workload-allocation and battery-control plans are developed from various predicted power data and reflected in the control system in real time; those plans contribute to maximizing renewable-energy usage and minimizing costs. The workload and battery storage are thus controlled according to an optimized plan. It will therefore become possible to maximize the use of renewable energy while minimizing electricity costs, enabling economical datacenter operations and contributing significantly to the goal of achieving carbon neutrality worldwide.
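As a toy version of the workload-shifting step, the greedy sketch below places each job in the datacenter with the most unused renewable power at that moment; the site names and surplus figures are invented, and the real control described above also folds in forecasts, prices, and battery plans.

```python
# Greedy workload placement by remaining renewable surplus (illustrative data).
renewable_surplus = {"site_a": 120.0, "site_b": 300.0, "site_c": 210.0}  # kW

def place(jobs_kw):
    plan = []
    for job in jobs_kw:
        dc = max(renewable_surplus, key=renewable_surplus.get)
        renewable_surplus[dc] -= job   # consume that site's surplus
        plan.append((job, dc))
    return plan

print(place([100.0, 80.0, 150.0, 60.0]))
```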

4.2.2 Real-time data synchronization

This exhibit showcased the results of testing the long-distance virtual storage platform, which, in combination with the IOWN APN, synchronizes storage devices in remote locations in real time over long distances and treats them as if they were a single storage device (Photo 12). Over conventional communication lines, the distance achievable by the long-distance virtual storage platform is limited to a practical upper limit of approximately 100 km. The round-trip response time of the communication line is a major factor in this limitation: a round-trip response time of less than 20 milliseconds is required to synchronize multiple storage devices.


Photo 12. Real-time data synchronization.

In a field demonstration using the IOWN APN, NTT achieved the world’s first successful data-storage synchronization over 600 km. The round-trip response time that we achieved was approximately 7.5 milliseconds over approximately 600 km (equivalent to the distance between Tokyo and Osaka). In this demonstration, the synchronization distance was gradually increased to 200, 400, and 600 km, and although latency increased in proportion to distance, it was confirmed that synchronization performance could be maintained even over 600 km.

These results indicate that, in theory, using the IOWN APN will enable data-storage synchronization between remote locations separated by approximately 1600 km, which roughly covers the distance from Tokyo to Hokkaido in the north and Okinawa in the south.
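These figures are consistent with a propagation-only estimate. Taking the speed of light in fiber as roughly c/n ≈ 2.0 × 10^5 km/s (refractive index n ≈ 1.5):

\[
\mathrm{RTT}_{600\,\mathrm{km}} \approx \frac{2 \times 600\,\mathrm{km}}{2.0 \times 10^{5}\,\mathrm{km/s}} \approx 6\,\mathrm{ms},
\qquad
d_{\max} \approx \frac{0.020\,\mathrm{s} \times 2.0 \times 10^{5}\,\mathrm{km/s}}{2} = 2000\,\mathrm{km},
\]

so the measured 7.5 ms leaves roughly 1.5 ms for equipment overhead, and the 20-ms budget caps the distance near 2000 km in theory, in line with the approximately 1600 km quoted above once overhead is accounted for.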

In this exhibit, a demonstration environment based on the results of the above field demonstration was built, and visitors were able to see that when the system on the Tokyo side was intentionally stopped, processing was automatically switched to the storage on the Osaka side, and backup and recovery proceeded without interruption.

This technology is expected to become an important foundation that will contribute to the implementation of distributed datacenters for companies that require constant availability, such as financial institutions and social-infrastructure operators. For example, when a datacenter in one region stops operating due to a large-scale natural disaster or power outage, if synchronized data in another region can be used as is, services can continue without having to initiate switching or recovery efforts, and important business data and services can be protected without interruption.

4.3 Quantum

4.3.1 Optical quantum computing for the future

One of the highlights of the R&D Forum 2025 was the optical quantum computer. This large-scale exhibit centered around a model of an actual optical quantum computer, a rack-type model representing NTT’s vision for the future, and accompanying video commentary explaining the operating principles of the computer (Photo 13).


Photo 13. Optical quantum computing for the future.

Although quantum computers are currently being developed in various countries, for them to be put to practical use, three challenges must be met: first, scalability (Can useful calculations be performed?); second, high calculation accuracy (Are the calculation results accurate?); and third, feasibility (Are power consumption and device size realistic?). Of the quantum computers currently being developed, none meet all three challenges. However, the development of quantum computers that significantly surpass the performance of current supercomputers is now an essential proposition for the future.

Quantum computers currently being developed include superconducting, neutral-atom, ion-trap, and semiconductor types. In collaboration with the University of Tokyo, RIKEN, and OptQC, NTT is developing a different type of quantum computer known as an optical quantum computer. By taking advantage of the properties of light, our optical quantum computer can operate at room temperature and normal pressure, has a space-saving design achieved through time and wavelength multiplexing, and achieves high speed by operating at optical frequencies. It also features an extremely high affinity with optical communications technology, which means it can use the technology that NTT has cultivated over many years. It also boasts overwhelming scalability that makes it easy to increase the number of qubits. The final system is expected to be extremely compact for a quantum computer, measuring only approximately 60 cm wide × 125 cm high × 80 cm deep.

By using high-speed, low-power optical quantum computers that leverage NTT’s optical communications and transmission technologies, we aim to implement general-purpose, large-scale systems around 2030 and address social issues that have been intractable.

4.4 Sustainability

4.4.1 Hydrogen-transportation technology

To add hydrogen to existing energy infrastructure, such as electricity and gas, NTT is developing hydrogen-transportation technology (Photo 14). While hydrogen has the potential to become the ultimate green energy source, its widespread adoption faces major challenges, including a lack of hydrogen stations and the difficulty of transporting hydrogen gas. When hydrogen is transported through pipelines (steel pipes) or in gas cylinders, hydrogen atoms are absorbed through the metal surface into the material, where they accumulate and weaken the bonding strength between metal atoms, a phenomenon known as hydrogen embrittlement. To prevent hydrogen embrittlement, pipes and cylinders must be constructed with special metals and materials that are resistant to it. Moreover, since hydrogen is the lightest and smallest diatomic molecule on Earth, it permeates more materials than other gases. Considering these facts, we developed a special double-piping system that addresses these challenges by using existing underground space in urban areas and achieving low costs. The core of this hydrogen-transportation technology is that it enables us to establish a system for safely supplying hydrogen by burying this pipeline.


Photo 14. Hydrogen-transportation technology.

Another feature of this hydrogen-transportation technology is its safety. As a safety measure, a method of adding an odor to the hydrogen to make leaks detectable, in a similar manner to that used for city gas, has been proposed. However, it is costly and the odor’s components can cause equipment such as fuel cells to fail. Therefore, we have developed a new pipeline system (a kind of safety technology) that enables hydrogen to be supplied without adding an odor to it. To further ensure safety, optical fiber is passed through the pipes to detect leaks and other malfunctions. In the unlikely event of a hydrogen leak, the system is also equipped with “dry-air devices” that remotely expel any leaking hydrogen in gaps in the double piping. The goal of this technology is to establish a hydrogen supply chain using these anomaly-detection technologies and safety measures and create a society where hydrogen is the main energy source.

4.5 NW

4.5.1 Cavity estimation using fiber sensing

In a press release prior to the R&D Forum, we unveiled a technology that uses existing optical fibers for communications to estimate the presence of cavities in the ground. This technology detects the signs of road collapses, as exemplified by the recent collapse in Yashio City, Saitama Prefecture. That collapse attracted a great deal of social attention, but the conventional technology used to investigate such collapses mainly relies on measurements taken from the ground, for example, vehicle-based surveys using vehicles equipped with electromagnetic-wave detectors or ultrasonic radar. However, such measurement methods have two drawbacks. First, frequent measurements to detect changes in the situation are unfeasible because, due to cost, they can only be carried out once every few years. Second, such measurements can only probe locations less than 3 m below the surface. Cavity surveys require monitoring technology suitable for deep underground areas (depths of more than 3 m). Two technologies have attracted attention: “geotechnical technology,” which involves individually placing special arrays and was devised by the National Institute of Advanced Industrial Science and Technology (AIST), and “optical fiber sensing,” developed by NTT (Photo 15). A joint research project between AIST and NTT verified that the two technologies achieve roughly the same performance. However, AIST’s technology requires the installation of new arrays at the target locations; in contrast, NTT’s optical fiber cables are already laid underground throughout the country for communications purposes, so they can be used to monitor underground conditions remotely and at low cost. What’s more, an underground cavity progresses gradually, so frequent, near-constant monitoring is required to determine the extent of its progress. In this respect, optical fiber sensing has an advantage. NTT’s optical fiber sensing technology will be tested in cooperation with local governments across the country in fiscal year 2026, and pre-service introduction is planned for 2027. NTT hopes to improve the accuracy of this technology, enabling early detection of road-collapse risks and contributing to the safety and security of local communities.


Photo 15. Cavity estimation using fiber sensing.

4.6 Security

4.6.1 Data security technology with robust key management

NTT developed a cost-effective and convenient encryption-key technology for reducing incidents related to cloud encryption-key management and ensuring data security (Photo 16). Used for data protection, authentication, digital signatures, etc., an encryption key is a string of characters used in an encryption algorithm to convert confidential data into a format that cannot be understood by third parties. Encryption keys are usually protected and managed within the cloud service that uses them; however, if an incident such as human error or internal fraud occurs during the provider’s management of the encryption key, data security could be breached. This technology, which is also compatible with post-quantum cryptography, ensures strong security by centrally managing the generation and operation of cryptographic keys in a cloud-based trusted execution environment. The National Institute of Standards and Technology (NIST) in the USA evaluates and certifies cryptographic algorithms, and acquiring this certification attests that a crypto library is trustworthy. NTT is currently the only company in Japan that possesses encryption-key technology that can pass this evaluation test. By using this encryption key, government agencies and companies that manufacture important materials can protect important data such as national and trade secrets.


Photo 16. Data security technology with robust key management.
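The centralized key-management pattern can be illustrated generically: the key is generated and held by the management service, and applications only ever see ciphertext. The sketch below uses the well-known Fernet recipe from the Python cryptography package as a stand-in; it is not NTT’s technology or a post-quantum algorithm.

```python
# Generic key-managed symmetric encryption; requires the 'cryptography' package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in the described design, the key would stay
f = Fernet(key)              # inside the cloud trusted execution environment

token = f.encrypt(b"national or trade secret")
print(f.decrypt(token))      # only a holder of the managed key can decrypt
```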

5. After the Forum

More than 23,000 people attended NTT R&D Forum 2025. The signing of a collaboration agreement between NTT and OptQC, a startup company spun out of the University of Tokyo, to build an optical quantum computer was a symbolic event toward putting quantum technology into practical use.

In addition to tsuzumi 2, NTT’s lightweight LLM boasting world-class performance, many visitors also took notice of the numerous technologies that will contribute to society in ways that many people can experience, such as the application of world models in the transportation sector and optical fiber sensing technology.

The NTT Group will continue to develop technologies that address social issues, provide new experiences and excitement, and create a new future together with everyone. Stay tuned for NTT’s R&D to see what quantum leaps and dramatic advances will take place in the business world and what kind of future awaits as IOWN and AI are combined with quantum computers.


Members of NTT R&D Forum Secretariat: (From left) Takanori Watanabe, Kenji Kobayashi, Naho Matsubara, Yoko Ono, Takayoshi Mochizuki, Hironari Yokoi, and Yuichi Maki.
