November 2024 Vol. 22 No. 11
View from the Top
- Takaaki Sato, Senior Executive Vice President, Head of R&D Innovation Division, NTT DOCOMO
Abstract
As a leader in the global mobile communications scene, NTT DOCOMO is pursuing the coexistence of artificial intelligence and humans, the construction of sustainable networks, and the development of innovative technologies. We spoke to Takaaki Sato, senior executive vice president of NTT DOCOMO, about the company’s technology strategy for creating a new world of communication culture and its outlook for technological development focused on 6G (the sixth-generation mobile communications system).
Front-line Researchers
- Shiro Saito, Senior Distinguished Researcher, NTT Basic Research Laboratories
Abstract
A challenge with the superconducting quantum bit (qubit), the basic element of a quantum computer, is its short lifetime. To overcome this challenge in developing a quantum computer, researchers have taken approaches such as elucidating the mechanisms that affect a qubit’s lifetime and extending that lifetime by correcting the errors that occur when the qubit reaches the end of its life. Extending the lifetime of a qubit is also expected to improve the accuracy of quantum sensing. A superconducting flux qubit has two quantum states, corresponding to the direction of the superconducting current, and by controlling these states with a magnetic field, the qubit can be used as a highly sensitive magnetometer. We interviewed Shiro Saito, a senior distinguished researcher at NTT Basic Research Laboratories, who aims to apply a hybrid combination of high-performance magnetometers and biological samples to pathological diagnosis. He talked about the detection of iron ions in neurons using superconducting flux qubits and shared his research approach of looking beyond the red ocean of a research field.
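To make the magnetometer principle described above concrete, the following is the standard textbook two-level model of a flux qubit, given here only as a general sketch; the persistent current \(I_p\), tunnel splitting \(\Delta\), and loop area \(A\) are generic symbols, not parameters from Saito’s experiments.

```latex
H = \frac{1}{2}\left(\varepsilon\,\sigma_z + \Delta\,\sigma_x\right),
\qquad
\varepsilon = 2 I_p\!\left(\Phi_{\mathrm{ext}} - \frac{\Phi_0}{2}\right),
\qquad
h f_{01} = \sqrt{\varepsilon^{2} + \Delta^{2}}
```

Because the external flux \(\Phi_{\mathrm{ext}} = B \cdot A\) tracks the local magnetic field \(B\), a small change in \(B\) shifts the energy bias \(\varepsilon\) and hence the transition frequency \(f_{01}\); reading out that frequency shift is what makes the qubit a sensitive magnetometer.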
Rising Researchers
- Naomi Yamashita, Distinguished Researcher, NTT Communication Science Laboratories
Abstract
While information technology enriches our lives, those of us who use it are also required to develop literacy in it and understand its pros and cons. For example, artificial intelligence (AI) is advancing day by day, and AI-human communication is even becoming easier, but we must be careful because this could also lead to a decline in human-to-human communication. NTT Distinguished Researcher Naomi Yamashita is working to deepen communication using information technology. For this issue, we spoke to her about her research into solving the various problems facing modern society and the mindset required of researchers.
Feature Articles: Exploring the Nature of Humans and Information for Co-creating Human-centered Technologies with AI
- New Developments in Communication Science Research in the Generative AI Era—Exploring the Nature of Humans and Information for Co-creating Human-centered Technologies with AI
Abstract
NTT Communication Science Laboratories (CS Labs) is dedicated to the advancement of “heart-to-heart communication” between humans and computer systems. Our research focuses on the development of fundamental theories that explore the nature of information and humans, as well as the creation of innovative technologies that will revolutionize society. This article highlights some of CS Labs’ efforts toward the coexistence of humans and artificial intelligence (AI), taking into account the recent and rapidly advancing trend of generative AI.
- Towards Reliable Infrastructures with Compressed Computation
Abstract
Contemporary society depends on network infrastructures such as telecommunication and transportation. Analyzing such infrastructures is essential for, e.g., designing high-performance networks and finding vulnerable network components. This analysis often requires considering combinations of network components such as roads and optical fibers, but the number of combinations grows so quickly that the analysis cannot be completed in a reasonable time. In this article, I introduce efficient network-analysis algorithms that tackle these computationally challenging problems using a decision diagram, a data structure that represents an enormous number of combinations in compressed form.
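To give a feel for how a decision diagram compresses combinations, the following is a minimal, generic sketch in Python, not the algorithms from the article: identical subproblems are stored once as shared nodes, so the number of valid combinations can be counted without enumerating them one by one. The four-edge toy network and the function names are hypothetical.

```python
# Minimal decision-diagram sketch: share identical subfunctions so that many
# combinations are stored once and counted without listing them individually.

class BDD:
    def __init__(self, n_vars):
        self.n = n_vars
        self.unique = {}            # (var, lo, hi) -> node id (node sharing)
        self.nodes = [None, None]   # id 0 = False terminal, id 1 = True terminal

    def mk(self, var, lo, hi):
        if lo == hi:                # redundant test: skip the node entirely
            return lo
        key = (var, lo, hi)
        if key not in self.unique:
            self.unique[key] = len(self.nodes)
            self.nodes.append(key)
        return self.unique[key]

    def build(self, pred, var=0, bits=()):
        """Shannon expansion of a Python predicate over n Boolean variables."""
        if var == self.n:
            return 1 if pred(bits) else 0
        lo = self.build(pred, var + 1, bits + (0,))
        hi = self.build(pred, var + 1, bits + (1,))
        return self.mk(var, lo, hi)

    def count(self, root):
        """Count satisfying combinations directly on the diagram."""
        memo = {}
        def rec(node, level):
            if node == 0:
                return 0
            if node == 1:
                return 2 ** (self.n - level)
            if (node, level) not in memo:
                var, lo, hi = self.nodes[node]
                free = 2 ** (var - level)   # variables skipped above this node
                memo[(node, level)] = free * (rec(lo, var + 1) + rec(hi, var + 1))
            return memo[(node, level)]
        return rec(root, 0)

# Toy network: source and sink joined by two parallel two-edge routes.
# Edge i is "up" when bit i is 1; the terminals stay connected if either route is intact.
def connected(bits):
    e0, e1, e2, e3 = bits
    return bool((e0 and e1) or (e2 and e3))

bdd = BDD(4)
root = bdd.build(connected)
print(bdd.count(root))        # 7 of the 16 edge on/off combinations keep the terminals connected
print(len(bdd.nodes) - 2)     # only 4 internal nodes represent those combinations
```

On larger networks the same idea pays off dramatically: the diagram typically has far fewer nodes than there are combinations, which is what makes questions such as reliability or vulnerability analysis tractable.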
- Human-centric Image Rendering for Natural and Comfortable Viewing—Image Optimization Based on Human Visual Information Processing Models
Abstract
As display technology and devices advance, using any surface or space as a display screen is becoming possible. However, emerging technologies that use projectors and see-through displays face challenges in maintaining consistent image quality, as the appearance of the displayed image can vary significantly depending on factors such as ambient light and background patterns. The key to solving this problem is understanding how the human visual system works. In this article, I introduce an approach that addresses this issue by modeling the visual information processing of the human brain. This model enables us to optimize displayed images to ensure they are perceived as intended despite environmental variations.
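As a much-simplified illustration of image compensation (not the perceptual model developed in this work), the sketch below assumes that projected light and background light simply add per pixel and pre-distorts the image so the viewer sees something close to the intended target; the calibration arrays are hypothetical.

```python
import numpy as np

# Simplified compensation sketch assuming a linear model per pixel:
#   observed = gain * projected + background
# Solving for "projected" pre-distorts the image so that, despite a patterned
# or bright surface, the observed result approximates the intended target.
# The gain and background values below are made-up calibration data.

def compensate(target, background, gain):
    projected = (target - background) / np.maximum(gain, 1e-6)
    return np.clip(projected, 0.0, 1.0)   # a projector cannot emit negative light

target = np.full((4, 4), 0.5)                            # intended mid-gray image
background = np.zeros((4, 4)); background[:, 1] = 0.3    # bright stripe on the surface
gain = np.full((4, 4), 0.9)
print(compensate(target, background, gain))
```

The article goes further than this physical correction: by modeling how the human visual system processes the displayed image, the optimization targets what is perceived rather than the raw pixel values.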
- Fast Knowledge Discovery from Big Data—Large-scale Data Analysis with Accuracy Guarantee via Efficient Pruning Methods
Abstract
There is growing interest in effectively using artificial-intelligence-based data analysis. Unfortunately, analyzing large-scale data incurs excessive computation costs. While approximate methods are commonly used to reduce computation costs, they cannot yield exact results; they sacrifice accuracy to improve efficiency. This article introduces representative methods of pruning unnecessary computations for fast and accurate data analysis.
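As a generic example of pruning that never changes the answer (a standard technique, not the specific methods introduced in the article), the sketch below finds an exact nearest neighbor while abandoning a distance computation as soon as its partial sum already exceeds the best distance found so far.

```python
import numpy as np

def nearest_exact_with_pruning(query, data):
    """Exact 1-nearest-neighbor search with safe early termination."""
    best_idx, best_dist = -1, float("inf")
    for i, x in enumerate(data):
        d = 0.0
        for q, v in zip(query, x):
            d += (q - v) ** 2
            if d >= best_dist:      # partial sum already too large: prune safely
                break
        else:                       # no pruning: d is the full squared distance
            best_idx, best_dist = i, d
    return best_idx, best_dist

rng = np.random.default_rng(0)
data = rng.random((1000, 64))
query = rng.random(64)
idx, dist = nearest_exact_with_pruning(query, data)
# Identical result to the brute-force computation, usually with far fewer operations.
print(idx, np.isclose(dist, np.sum((data[idx] - query) ** 2)))
```

The pruned terms could only have made a candidate worse, so skipping them is guaranteed not to alter the result; this accuracy-preserving shortcut is the spirit of the methods described in the article.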
- The Crux of Human Movement Variability
Abstract
Despite intense training, even a seasoned baseball pitcher has difficulty throwing a ball to the same location repeatedly. I describe my research team's finding that such movement variability comes from muscles that activate at imprecise times and introduce a new method that uses a smartphone to robustly quantify movement variability, which is tightly related to the arm muscles' timing precision. This useful method reveals how movement variability, a measure of dexterity, changes with growth and age, and it quantifies the degree of handedness and footedness.
Feature Articles: Reducing Security Risks in Supply Chains by Improving and Utilizing Security Transparency
- Addressing Supply Chain Security Risks through Security Transparency
Abstract
NTT has launched the Security Transparency Consortium to promote research and development of technologies for reducing supply chain security risks based on the key concept of security transparency, and to work with the various companies that form the supply chain to mitigate such risks. In this article, we introduce international trends related to supply chain security risks, relevant technologies, and an overview of the consortium.
- Activities of the Security Transparency Consortium to Enhance the Effective Use of Visualization Data
Abstract
To promote responses to supply chain security risks using visualization data, it is crucial for the various businesses that form the supply chain to collaborate and advance the effective use of visualization data in the collaborative domain. This article introduces the activities of the Security Transparency Consortium, established in September 2023 to foster co-creation of knowledge among these diverse businesses and encourage the use of visualization data. It also discusses the challenges associated with promoting the use of visualization data.
- Enhancing Software Vulnerability Management with Visualization Data
Abstract
To use visualization data effectively in software vulnerability management, it is essential to clarify its use cases. This article introduces examples of how visualization data can be used within organizational vulnerability management.
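As one heavily simplified illustration of such a use case (the component names, versions, and advisory entries below are made up, and real deployments would use standard SBOM formats and vulnerability feeds), the sketch matches an inventory of software components against a list of known vulnerable versions.

```python
# Hypothetical example: match SBOM-style visualization data against advisories.

sbom = [
    {"name": "libexample", "version": "1.2.3"},
    {"name": "webframework", "version": "4.0.1"},
]

advisories = [
    {"name": "libexample", "vulnerable_versions": {"1.2.2", "1.2.3"}, "id": "ADV-0001"},
]

def affected_components(sbom, advisories):
    hits = []
    for component in sbom:
        for adv in advisories:
            if (component["name"] == adv["name"]
                    and component["version"] in adv["vulnerable_versions"]):
                hits.append((component["name"], component["version"], adv["id"]))
    return hits

print(affected_components(sbom, advisories))
# -> [('libexample', '1.2.3', 'ADV-0001')]
```

Automating this kind of matching is one concrete way visualization data can support organizational vulnerability management, as the article discusses in more detail.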
- Efforts to Improve and Utilize Security Transparency in Software Supply Chains
Abstract
Looking back on the expectations of various stakeholders for using visualization data to reduce risks in software supply chains, and on the reality that such use has not yet progressed, we introduce the latest research trends for addressing issues related to the use of visualization data and the security transparency technologies that NTT Social Informatics Laboratories is investigating.
Global Standardization Activities
- Standardization Trends on QoE Evaluation in ITU-T Study Group 12
Abstract
This article introduces recent standardization activities related to the evaluation of the quality of experience (QoE) of speech, audiovisual, and other new services such as object recognition for autonomous driving. The article focuses on the activities of Study Group 12 of the International Telecommunication Union - Telecommunication Standardization Sector (ITU-T SG12), which is responsible for standardization work on performance, quality of service, and QoE.
External Awards/Papers Published in Technical Journals and Conference Proceedings