A computer predicts your thoughts, creating images based on them

By monitoring brain function, computers can be made to imagine what a person is thinking of and present the results as images. The technique can be used in psychology and cognitive neuroscience, as well as to support human creativity.

Researchers at the University of Helsinki have developed a technique in which a computer models visual perception by monitoring human brain signals. In a way, it is as if the computer tries to imagine what a human is thinking about. As a result of this imagining, the computer is able to produce entirely new information, such as fictional images that were never before seen.
The technique is based on a novel brain-computer interface. Previously, similar brain-computer interfaces have performed one-way communication from brain to computer, such as spelling individual letters or moving a cursor. As far as is known, the new study is the first in which both the computer’s presentation of the information and the brain signals were modeled simultaneously using artificial intelligence methods.
Images that matched the visual characteristics that participants were focusing on were generated through interaction between human brain responses and a generative neural network.
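The closed loop described above can be sketched in a few lines of Python. Everything here is a stand-in, not the Helsinki team's actual model: the "generator" and "EEG decoder" are placeholder functions, and the relevance score is simulated from a hidden target rather than decoded from real brain responses. The sketch only shows the adaptive principle: sample candidate images around the current estimate, weight them by decoded relevance, and move the estimate toward the weighted mean.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 8

def generate_image(z):
    # Stand-in for a pretrained generative network; here the "image"
    # is just the latent vector itself, to keep the loop readable.
    return z

def decoded_relevance(image, target_image):
    # Stand-in for the EEG decoder: in the real system this score
    # comes from brain responses, not from a known target.
    return np.exp(-np.linalg.norm(image - target_image) ** 2)

target = rng.normal(size=LATENT_DIM)     # the concept the user imagines
target_image = generate_image(target)
estimate = np.zeros(LATENT_DIM)          # current latent estimate

for i in range(200):
    spread = 0.5 * 0.99 ** i             # sample more tightly over time
    # Show candidate images sampled around the current estimate ...
    candidates = estimate + spread * rng.normal(size=(16, LATENT_DIM))
    # ... score each one by the (simulated) brain response ...
    weights = np.array(
        [decoded_relevance(generate_image(z), target_image) for z in candidates]
    )
    # ... and move the estimate toward relevance-weighted candidates.
    estimate = weights @ candidates / weights.sum()

print(np.linalg.norm(estimate - target))  # distance to the hidden concept
```

The design choice worth noting is that the user never gives explicit feedback: the loop is steered entirely by the relevance signal, which is what makes the interface two-way.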

US Space Force To Use Blockchain For Data Security Systems

Blockchain firm Xage Security was awarded a contract by the United States Space Force (USSF) and the U.S. Air Force Research Lab to develop a system for data security.

The Silicon Valley cybersecurity firm won the $743,000 contract to help the Space Force securely connect military and civilian satellites with ground stations over tamper-proof networks, according to Space News. This will be done using the company’s Xage Security Fabric solution, which further secures the systems by removing single points of entry so that hackers cannot wipe information, Cointelegraph reports.

Through Xage’s unified platform, the Space Force will be able to verify the parties and people accessing its systems and ensure that even if ground equipment goes offline, the satellites will continue to function. At the same time, all data would remain fully protected until it is cleared for transfer.
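Xage's fabric itself is proprietary, but the tamper-evidence idea behind blockchain-protected data can be sketched with a simple hash chain: each record carries the hash of its predecessor, so altering any earlier record invalidates everything after it. The record contents below are invented for illustration.

```python
import hashlib
import json

def block_hash(contents):
    # Hash the block's contents together with the previous hash,
    # so changing any earlier record changes every later hash.
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(chain, record):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev": prev}
    block["hash"] = block_hash({"record": record, "prev": prev})
    chain.append(block)

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash({"record": block["record"], "prev": prev}):
            return False
        prev = block["hash"]
    return True

chain = []
for msg in ["uplink granted", "telemetry frame 1", "telemetry frame 2"]:
    append(chain, msg)

print(verify(chain))                               # True
chain[1]["record"] = "telemetry frame 1 (forged)"  # tamper with history
print(verify(chain))                               # False: detected
```

A distributed ledger adds replication and consensus on top of this, so there is also no single node whose compromise can rewrite the chain.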

According to Xage’s press release, the Space Force requires decentralized enforcement of security to ensure space domain resilience, and to that end the company has built a solution to serve the needs of complex infrastructure systems.

“We are excited to bring the Xage solution to the Space Force in the form of blockchain-protected space system security,” said Duncan Greatwood, Xage’s CEO. This is Xage’s second contract with the Air Force; the first, signed in 2019, saw the Air Force evaluate the company’s blockchain-protected Security Fabric.

Source: International Business Times


Engineers imitate human hands to make better sensors


UNIVERSITY PARK, Pa. — An international research team has developed “electronic skin” sensors capable of mimicking the dynamic process of human motion. This work could help severely injured people, such as soldiers, regain the ability to control their movements, as well as contribute to the development of smart robotics, according to Huanyu “Larry” Cheng, Dorothy Quiggle Career Development Professor in the Penn State Department of Engineering Science and Mechanics.

Cheng and collaborating researchers based in China published their work in a recent issue of Nano Energy.

“The skin of the human hand is amazing — that’s what we tried to imitate,” Cheng said. “How do we capture texture and force? What about the years of evolution that produced the impressive sensitivity of the fingertip? We’re attempting to reproduce this biological and dynamic process to enable objects to behave similarly to the human hand.”

The dual-mode sensor measures both the magnitude of a load, such as the effort of swinging a tennis racquet, and its rate, duration, and direction. The trick was to decouple these measurements and understand how the separate parameters influence each other.

For example, bouncing a tennis ball gently on a racquet requires different input than serving a ball to an opponent. Those same variables come into play when a person with a prosthetic arm needs to differentiate between handling an egg or carrying a watermelon.
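As a rough illustration of decoupling those parameters, a single force-versus-time trace can be separated into magnitude, rate, duration, and direction. The traces and threshold below are invented for the example, not data from the Nano Energy paper.

```python
import numpy as np

def describe_load(t, f, threshold=0.05):
    """Split one force trace into the decoupled parameters the
    article mentions: magnitude, rate, duration, and direction."""
    magnitude = np.max(np.abs(f))
    rate = np.max(np.abs(np.gradient(f, t)))        # fastest change
    active = np.abs(f) > threshold * magnitude      # when the load acts
    duration = t[active][-1] - t[active][0] if active.any() else 0.0
    direction = np.sign(f[np.argmax(np.abs(f))])    # push vs. pull
    return magnitude, rate, duration, direction

t = np.linspace(0.0, 1.0, 1000)
gentle = 0.2 * np.sin(np.pi * t)                    # soft, slow press
serve = 2.0 * np.sin(4 * np.pi * t) * (t < 0.25)    # sharp, brief hit

for name, f in [("gentle bounce", gentle), ("serve", serve)]:
    m, r, d, s = describe_load(t, f)
    print(f"{name}: magnitude={m:.2f}, rate={r:.1f}, "
          f"duration={d:.2f}s, direction={s:+.0f}")
```

The two traces produce clearly different parameter sets: the serve has a much larger magnitude and rate but a shorter duration, which is exactly the distinction a prosthetic controller would need between an egg and a watermelon.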

News Source: PennState


A new study allows brain and artificial neurons to link up over the web

Research on novel nanoelectronic devices led by the University of Southampton has enabled brain neurons and artificial neurons to communicate with each other. This study has for the first time shown how three key emerging technologies can work together: brain-computer interfaces, artificial neural networks and advanced memory technologies (also known as memristors). The discovery opens the door to further significant developments in neural and artificial intelligence research.

Brain functions are made possible by circuits of spiking neurons, connected together by microscopic, but highly complex links called ‘synapses’. In this new study, published in the journal Scientific Reports, the scientists created a hybrid neural network where biological and artificial neurons in different parts of the world were able to communicate with each other over the internet through a hub of artificial synapses made using cutting-edge nanotechnology. This is the first time the three components have come together in a unified network.

During the study, researchers based at the University of Padova in Italy cultivated rat neurons in their laboratory, whilst partners from the University of Zurich and ETH Zurich created artificial neurons on silicon microchips. The virtual laboratory was brought together via an elaborate setup controlling nanoelectronic synapses developed at the University of Southampton. These synaptic devices are known as memristors.
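A toy version of that three-part chain (spiking input, memristive synapse, silicon neuron) can be sketched as follows. The plasticity rule and neuron constants are simplifications chosen for readability, not the devices used in the study.

```python
import numpy as np

class MemristorSynapse:
    """Toy memristor: conductance steps up with each spike it passes
    (potentiation) and slowly relaxes back, mimicking plasticity."""
    def __init__(self, g=0.2, g_max=1.0):
        self.g, self.g_max = g, g_max
    def transmit(self, spike):
        current = self.g * spike
        if spike:
            self.g = min(self.g_max, self.g + 0.05)  # potentiate
        self.g *= 0.999                              # slow relaxation
        return current

class LIFNeuron:
    """Leaky integrate-and-fire neuron on the 'silicon' side."""
    def __init__(self, tau=20.0, threshold=1.0):
        self.v, self.tau, self.threshold = 0.0, tau, threshold
    def step(self, current):
        self.v += -self.v / self.tau + current       # leak + input
        if self.v >= self.threshold:
            self.v = 0.0                             # fire and reset
            return 1
        return 0

rng = np.random.default_rng(1)
pre_spikes = (rng.random(500) < 0.1).astype(int)     # "biological" train
syn, post = MemristorSynapse(), LIFNeuron()
out = [post.step(syn.transmit(s)) for s in pre_spikes]
print(sum(pre_spikes), sum(out))  # input spikes vs. output spikes
```

Early on the weak synapse barely drives the neuron; as spikes potentiate the memristor, the artificial neuron begins to fire in response, which is the essence of relaying activity through a plastic synaptic hub.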

News Source: University Of Southampton


Security software for autonomous vehicles

Before autonomous vehicles can participate in road traffic, they must demonstrate conclusively that they do not pose a danger to others. New software developed at the Technical University of Munich (TUM) helps prevent accidents by predicting different variants of a traffic situation every millisecond.

Algorithms that peer into the future

The ultimate goal when developing software for autonomous vehicles is to ensure that they will not cause accidents. Matthias Althoff, a member of the Munich School of Robotics and Machine Intelligence at TUM, and his team have now developed a software module that continuously analyzes and predicts events while driving.

Streamlined models for swift calculations

This kind of detailed traffic situation forecasting was previously considered too time-consuming and thus impractical. But now, the Munich research team has shown not only the theoretical viability of real-time data analysis with simultaneous simulation of future traffic events: They have also demonstrated that it delivers reliable results.

Real traffic data for a virtual test environment

For their evaluation, the computer scientists created a virtual model based on real data they had collected during test drives with an autonomous vehicle in Munich. Althoff emphasizes that the new security software could simplify the development of autonomous vehicles because it can be combined with all standard motion control programs.
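Millisecond-by-millisecond forecasting of this kind is commonly done with reachability analysis: over-approximate every position another road user could occupy and reject any plan that overlaps that set. A one-dimensional sketch follows, with all numbers invented; the real TUM software handles full 2D traffic scenes and formal occupancy sets.

```python
def reachable_interval(x0, v0, t, a_max=3.0):
    """Over-approximate another car's position along its lane after t
    seconds, assuming v0 >= 0 and |acceleration| <= a_max (m/s^2)."""
    t_stop = v0 / a_max                        # time to stop, full braking
    if t <= t_stop:
        lo = x0 + v0 * t - 0.5 * a_max * t * t
    else:
        lo = x0 + v0 * v0 / (2 * a_max)        # braked to a standstill
    hi = x0 + v0 * t + 0.5 * a_max * t * t     # kept accelerating
    return lo, hi

def crossing_is_safe(t_enter, t_exit, zone, x0, v0, dt=0.001):
    """Reject the plan if, at any millisecond while the ego car occupies
    the intersection, the other car could also be inside it."""
    t = t_enter
    while t <= t_exit:
        lo, hi = reachable_interval(x0, v0, t)
        if lo <= zone[1] and hi >= zone[0]:    # intervals overlap
            return False
        t += dt
    return True

zone = (0.0, 4.0)   # the intersection spans 4 m of the other car's lane
# Other car: 50 m before the zone, travelling at 15 m/s toward it.
print(crossing_is_safe(0.5, 1.5, zone, x0=-50.0, v0=15.0))  # True
print(crossing_is_safe(2.5, 4.0, zone, x0=-50.0, v0=15.0))  # False
```

The key property is conservatism: because the interval covers every physically possible maneuver, a plan declared safe stays safe no matter what the other driver does, while an overly slow crossing that lingers in the zone is rejected.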

News Source: Technical University Of Munich
