As computing becomes more and more ubiquitous in daily life through technologies like AI and the Internet of Things, researchers are working to develop a new generation of microchips that not only outperform current technology but are also more energy-efficient. One possible route to these microchips is using materials that demonstrate negative capacitance – that is, the ability to store a greater amount of electrical charge at lower voltages. At Berkeley Lab, a multidisciplinary team of researchers recently relied on NERSC's Perlmutter supercomputer to help develop FerroX, a new open-source 3D simulation framework. The framework has already enabled a new research capability: modeling the atomistic origins of negative capacitance in 3D at the device level, which may contribute to the faster, cheaper development of these ultra-efficient microchips. Learn more: https://bit.ly/47qeocf U.S. Department of Energy (DOE) Berkeley Lab Computing Sciences
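Negative capacitance in ferroelectrics is often introduced through a Landau free-energy picture, where an energy landscape with negative curvature around zero polarization implies negative differential capacitance. The toy 1D sketch below illustrates only that idea; the Landau coefficients are arbitrary illustrative values, and it is not a substitute for (or an excerpt from) the 3D phase-field device modeling that FerroX performs.

```python
import numpy as np

# Toy 1D Landau model of a ferroelectric: U(P) = alpha*P**2 + beta*P**4.
# With alpha < 0 the energy has negative curvature around P = 0, and since
# differential capacitance scales as 1/(d^2U/dP^2), that window is the
# "negative capacitance" regime. Coefficients below are illustrative only.
alpha, beta = -1.0, 1.0          # hypothetical Landau coefficients
P = np.linspace(-1.2, 1.2, 401)  # polarization (arbitrary units)

U = alpha * P**2 + beta * P**4            # free-energy landscape
curvature = 2 * alpha + 12 * beta * P**2  # d^2U/dP^2
nc_window = P[curvature < 0]              # where differential capacitance is negative

print(f"Negative-capacitance window: |P| < {nc_window.max():.3f}")
```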
Posts from the National Energy Research Scientific Computing Center (NERSC)
Most relevant posts
-
Retired. #Microspace, #Robotic Future. #HumanExploration, (#NASA), …#SpaceTech, #AI #speed #Peace #Diplomacy #Defense, #Justice, #Creation, #Partnership #GlobalSecurity. #Insights #Analysis, Reviewer, Advisor
EPFL engineers have created a device that can efficiently convert heat into electrical voltage at temperatures lower than that of outer space. The innovation could help overcome a significant obstacle to the advancement of quantum computing technologies, which require extremely low temperatures to function optimally. To perform quantum computations, quantum bits (qubits) must be cooled to temperatures in the millikelvin range (just above absolute zero, about -273 °C) to slow atomic motion and minimize noise. However, the electronics used to manage these quantum circuits generate heat, which is difficult to remove at such low temperatures.
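The scale of such heat-to-voltage conversion can be illustrated with the textbook thermoelectric relation V ≈ S·ΔT. The snippet below is only a back-of-the-envelope sketch; the coefficient and temperature difference are hypothetical placeholders, not measured figures from the EPFL device.

```python
# Back-of-the-envelope thermoelectric estimate: V ~ S * dT.
# S_eff (Seebeck-like coefficient, V/K) and dT (temperature difference, K)
# are illustrative placeholders, not values from the EPFL device.
S_eff = 50e-6   # hypothetical effective coefficient, 50 µV/K
dT = 0.1        # hypothetical temperature difference of 100 mK

V = S_eff * dT
print(f"Estimated output voltage: {V * 1e6:.2f} µV")
```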
Novel 2D device for quantum cooling converts heat to voltage at ultra-low temperatures
phys.org
-
Optical Computing: Optical computing is an emerging field that explores the use of light, rather than electricity, to perform computational tasks. Unlike traditional electronic computers, which rely on electrical signals to process and transmit data, optical computing harnesses the properties of light for faster, more efficient processing.

In optical computing, photons, the particles of light, are used to carry and manipulate information. These photons can be controlled and guided through optical components such as lenses, mirrors, and waveguides. Optical systems can exploit phenomena such as interference and diffraction to perform many calculations in parallel, which can yield large gains in effective processing speed.

One of the main advantages of optical computing is its potential for high-speed processing: light signals propagate extremely quickly, allowing rapid data transmission and manipulation. Optical systems also generate less heat than their electronic counterparts, reducing the need for complex cooling and improving energy efficiency.

However, optical computing is still largely experimental, facing challenges such as the development of reliable optical components and integration with existing electronic systems. Researchers are actively working to overcome these hurdles and unlock optical computing for applications ranging from data processing and telecommunications to artificial intelligence and scientific simulations. As the technology matures, optical computing holds promise for the future of computing through its speed, efficiency, and scalability.
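A classic concrete example of computing with light is that a simple lens produces the 2D Fourier transform of the field at its input plane in a single pass, with every output point formed in parallel. The sketch below is a numerical stand-in for that idea, assuming an idealized lens/4f geometry and using an FFT as a proxy for what the optics does physically; it is not a simulation of any specific optical processor.

```python
import numpy as np

# Idealized stand-in for one stage of a 4f optical system: a thin lens maps
# the complex field at its front focal plane to its 2D Fourier transform at
# the back focal plane. Here an FFT plays the role of the lens; the input
# pattern is an arbitrary test aperture.
N = 256
field = np.zeros((N, N), dtype=complex)
field[N//2 - 16:N//2 + 16, N//2 - 16:N//2 + 16] = 1.0  # square aperture

# "Lens" transform: all N*N output points are produced in one physical pass.
spectrum = np.fft.fftshift(np.fft.fft2(field))
intensity = np.abs(spectrum) ** 2   # what a detector at the Fourier plane records

print("Peak of the diffraction pattern:", intensity.max())
```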
-
Quantum computing is the development of computer technologies based on the principles of quantum theory, which explains the behavior of matter and energy at the quantum level. Quantum computers can handle more than the binary information that conventional computers operate on: a quantum bit can also exist in superpositions of 0 and 1, which in turn enables new types of simulation and calculation. In quantum computation the spin direction, either up or down, serves as the basic unit of information, analogous to the 0 or 1 bit in a classical computing system. Because of quantum superposition, an electron spin can effectively represent 0 and 1 at the same time, and entanglement between qubits further enhances the ability to perform complex computations. Looking ten years ahead, quantum nanotechnology could be among the most advanced research areas by 2034: first, advanced silicon-based computational technology; next, nanomaterials that deliver rapid results. For example, experiments on copper nanomaterials could be carried out with the help of quantum computing, potentially driving a new industrial revolution.
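A minimal, hardware-agnostic illustration of superposition is a two-component state vector acted on by a Hadamard gate; the sketch below assumes nothing about spins, photons, or any particular quantum platform.

```python
import numpy as np

# Minimal state-vector illustration of superposition (hardware-agnostic).
# A qubit state is a 2-component complex vector; a Hadamard gate takes the
# basis state |0> into an equal superposition of |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0              # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2    # Born rule: measurement probabilities

print("Amplitudes:", state)
print("P(0), P(1):", probs)   # both 0.5: the qubit is 'both' until measured
```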
-
Researchers at the Massachusetts Institute of Technology (MIT), using facilities at MIT and Harvard University’s Center for Nanoscale Systems (part of the National Nanotechnology Coordinated Infrastructure network), have demonstrated current-controlled, non-volatile #magnetization switching in an atomically thin van der Waals magnetic material at room temperature. Magnets composed of atomically thin #VanDerWaals materials can typically only be controlled at extremely cold temperatures, so the fact that the researchers were able to control these materials at room temperature is key. The researchers’ ultimate goal is to bring van der Waals #magnets to commercial applications, including magnetic-based devices with unprecedented speed, efficiency, and scalability. https://lnkd.in/gPRnvRuT (Work funded by the National Science Foundation (NSF))
Researchers harness 2D magnetic materials for energy-efficient computing
news.mit.edu
-
Dear All,

I hope this message finds you well. Please find below the work referenced from the NFSU Goa 2022 Conference paper, along with material from our book https://lnkd.in/gfQEhs43 Deep Learning Networks: Design, Development, and Deployment. Although Physics Review did not publish this in 2022, I believe the recent spotlight on AI, particularly following its recognition in the 2024 Physics Nobel Prize, may have shifted interests. This could make the journal more receptive to revisiting our work.

We modeled a sensor network to explore the relationship between the works of 1) Prof. John Hopfield, 2) Prof. Geoffrey Hinton, and 3) Prof. Brooks-Iyengar. Our analysis (during 2020-2022) shows that while energy-based models in Prof. Hinton's framework may not always reach the same equilibrium when viewed through stochastic processes, Prof. Hopfield's models tend to converge more consistently (though some local-minima issues may arise). Brooks-Iyengar networks, however, offer deterministic termination (a conjecture on my part), but with high computational complexity due to interactions between all nodes. This complexity challenges natural parallelism on NVIDIA chipsets, potentially necessitating custom ASIC development for efficient training.

I look forward to your thoughts on this.

Best regards,
Jayakumar S, Ph.D., Systems and Control Engineering, IIT Mumbai
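As a minimal illustration of the convergence behavior mentioned above, the sketch below runs asynchronous updates in a tiny Hopfield network, whose energy cannot increase and therefore reaches a fixed point (possibly a local minimum). The network size, stored pattern, and corruption level are arbitrary choices for the example, not taken from the referenced paper or book.

```python
import numpy as np

# Tiny Hopfield network: Hebbian storage of one pattern, asynchronous updates.
# The energy E = -0.5 * s @ W @ s is non-increasing under these updates, so
# the dynamics terminate at a fixed point (possibly a local minimum).
rng = np.random.default_rng(0)
pattern = np.sign(rng.standard_normal(16))       # stored +/-1 pattern
W = np.outer(pattern, pattern) / pattern.size
np.fill_diagonal(W, 0.0)

state = pattern.copy()
state[:6] *= -1                                  # corrupt a few bits

def energy(s):
    return -0.5 * s @ W @ s

for _ in range(5):                               # a few sweeps suffice here
    for i in rng.permutation(state.size):        # asynchronous updates
        state[i] = 1.0 if W[i] @ state >= 0 else -1.0

print("Recovered stored pattern:", bool(np.array_equal(state, pattern)))
print("Final energy:", energy(state))
```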
-
Exciting News! Syracuse University is taking a bold step towards becoming a global leader in intelligent manufacturing of semiconductors!

Syracuse University has recently announced plans to establish a state-of-the-art Advanced Manufacturing Center, with a whopping investment of $20 million! This center will bring together experts in artificial intelligence, cybersecurity, manufacturing processes, and robotics, creating a powerhouse of knowledge and innovation.

The world is witnessing a revolution in the field of artificial intelligence and advanced manufacturing. Generative AI, a transformative technology, has the power to generate human-like text, images, code, and more. It's almost miraculous! Syracuse University's initiative aligns perfectly with this trend, positioning the school and the region as a global leader in research and education on the intelligent manufacturing of semiconductors. [1] [3]

The Golem, one of the earliest AI prototypes, paved the way for the incredible advancements we see today. It's fascinating to see how far we've come since then. The Advanced Manufacturing Center at Syracuse University will build upon this legacy, pushing the boundaries of what's possible in the world of AI and manufacturing. [2]

As part of their efforts, Syracuse University will leverage cutting-edge technologies like autonomous driving dataset visualization with Python and VizViewer. This will enable them to analyze and predict autonomous driving paths, contributing to the development of safer and more efficient transportation systems.

The center's focus on cybersecurity is crucial in today's digital age. With the increasing reliance on AI and advanced manufacturing, protecting sensitive data and ensuring the integrity of manufacturing processes is of utmost importance. Syracuse University's expertise in this area will play a vital role in shaping the future of secure and resilient manufacturing.

Let's celebrate this exciting initiative by Syracuse University! Comment below and share your thoughts on how this Advanced Manufacturing Center can revolutionize the semiconductor industry. Together, we can shape the future of intelligent manufacturing!

#SyracuseUniversity #AdvancedManufacturing #IntelligentManufacturing #AI #Cybersecurity #Robotics #Semiconductors #Innovation #FutureOfWork #Education #Research

References:
[1] Your AI is Only as Good as Your Data: https://lnkd.in/eKEDBBh5
[2] The Golem in the age of artificial intelligence: https://lnkd.in/dgt77xQi
[3] Autonomous Driving Dataset Visualization with Python and VizViewer: https://lnkd.in/dsMqpDa8
Syracuse University Plans $20M Advanced Manufacturing Center
govtech.com
-
A great twist in microelectronics and memristor manufacturing using silk. The silk route may in fact become the standard as its usage encompasses many areas of electronics. These researchers report that they have achieved a uniform two-dimensional (2D) layer of silk protein fragments, or "fibroins," on graphene, a carbon-based material useful for its excellent electrical conductivity. It's important to note that this system is nontoxic and water-based, which is crucial for biocompatibility.

This combination of materials, silk-on-graphene, could form a sensitive, tunable transistor highly desired by the microelectronics industry for wearable and implantable health sensors. The PNNL team also sees potential for their use as a key component of memory transistors, or "memristors," in computing neural networks. Memristors, used in neural networks, allow computers to mimic how the human brain functions.

The same underlying properties that make silk fabric world-renowned (elasticity, durability, and strength) have led to its use in advanced materials applications. The team carefully controlled the reaction conditions, adding individual silk fibers to the water-based system in a precise manner. Under these precise laboratory conditions, the team achieved a highly organized 2D layer of proteins packed in parallel β-sheets, one of the most common protein shapes in nature. Further imaging studies and complementary theoretical calculations showed that the thin silk layer adopts a stable structure with features found in natural silk. An electronic structure at this scale, less than half the thickness of a strand of DNA, supports the miniaturization found everywhere in the bio-electronics industry.

Indeed, the researchers are planning to use this starting material and technique to create their own artificial silk with functional proteins added to enhance its usefulness and specificity. This study represents the first step in controlled silk layering on functional electronic components. Key areas of future research include improving the stability and conductivity of silk-integrated circuits and exploring silk's potential in biodegradable electronics to increase the use of green chemistry in electronic manufacturing. #climatechange #microelectronics #silktronics
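For readers unfamiliar with memristors, the sketch below integrates the classic linear ion-drift memristor model (Strukov et al., 2008). It is a generic textbook model with illustrative parameter values, not a model of the silk-on-graphene devices described above; it simply shows how a memristor's resistance depends on the history of the current through it.

```python
import numpy as np

# Linear ion-drift memristor model (Strukov et al., 2008), generic textbook
# version -- not a model of the silk-on-graphene device. A slow sinusoidal
# drive moves the internal state w, so resistance depends on the history of
# the current: the defining "memory" behaviour. Parameter values are
# illustrative placeholders.
R_on, R_off = 100.0, 16e3     # limiting resistances (ohm)
D, mu = 10e-9, 1e-14          # film thickness (m), dopant mobility (m^2 V^-1 s^-1)
w = 0.1 * D                   # initial doped-layer width

dt = 1e-4
t = np.arange(0.0, 1.0, dt)
v = 2.0 * np.sin(2 * np.pi * 1.0 * t)   # 1 Hz, 2 V drive

R_hist = []
for vk in v:
    R = R_on * (w / D) + R_off * (1.0 - w / D)          # state-dependent resistance
    i = vk / R
    w = np.clip(w + mu * (R_on / D) * i * dt, 0.0, D)   # state (memory) update
    R_hist.append(R)

print(f"Resistance ranged from {min(R_hist):.0f} to {max(R_hist):.0f} ohm over one drive cycle.")
```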
2D silk protein layers on graphene pave the way for advanced microelectronics and computing
phys.org
-
Computers are fundamental to modern life, influencing everything from personal productivity to global communication and scientific research. Here's a comprehensive overview of computers, including their history, components, types, and impact on society.

History of Computers
The evolution of computers can be traced back to early mechanical calculators, such as the abacus and the Antikythera mechanism. The first true computers emerged in the mid-20th century:
- Early Mechanical Computers: Charles Babbage's Analytical Engine, conceptualized in the 1830s, is often considered the first design for a general-purpose computer.
- First Electronic Computers: In the 1940s, machines like ENIAC (Electronic Numerical Integrator and Computer) and Colossus were developed, marking the transition from mechanical to electronic computing.

Components of a Computer
- Central Processing Unit (CPU): Often called the "brain" of the computer, the CPU performs instructions from programs. Modern CPUs can perform billions of operations per second.
- Memory: Includes RAM (Random Access Memory) for temporary storage and ROM (Read-Only Memory) for permanent instructions. Storage devices like hard drives and SSDs (Solid-State Drives) provide long-term data storage.
- Motherboard: The main circuit board that houses the CPU, memory, and other essential components.
- Input and Output Devices: Input devices include keyboards and mice, while output devices include monitors and printers. These allow users to interact with the computer.

Types of Computers
Computers come in various forms, each serving different purposes:
- Supercomputers: Extremely fast computers used for complex simulations, scientific research, and tasks requiring immense computational power.
- Embedded Systems: Computers integrated into other devices, like smartphones, appliances, and cars, to perform specific functions.

Impact on Society and Future Trends
- Communication: The internet and email revolutionized how people communicate, leading to the rise of social media and real-time messaging.
- Education: E-learning platforms and educational software have expanded access to knowledge and learning opportunities worldwide.
- Healthcare: Computers are used in medical imaging, patient records management, and research. They support telemedicine and complex diagnostic processes.
- Science and Research: Computational power has accelerated scientific discoveries, from genome sequencing to space exploration. Supercomputers simulate climate models, physical processes, and large-scale scientific problems.
- Artificial Intelligence (AI): AI and machine learning are transforming how computers process data and make decisions, impacting fields like autonomous vehicles, healthcare, and customer service.

Computers are integral to our lives, continuously evolving to meet the demands of an increasingly digital world. #snsinstitutions #snsdesignthinkers #designthinking
-
Dedicated chemist with a passion for research, innovation, and creating solutions to advance scientific discovery.
Based on the search results, here are some key examples of nanotechnology applications in computer science:
- Faster and more efficient computer chips: Nanotechnology enables the fabrication of transistors and integrated circuits at the nanoscale, allowing for more transistors to be packed into a smaller area. This leads to faster, more powerful, and more energy-efficient computer chips.
- Improved data storage: Nanomaterials like carbon nanotubes and quantum dots can be used to create high-density data storage devices with much greater storage capacity than traditional silicon-based storage.
- Nanophotonics for data transmission: Nanophotonic devices that use surface plasmons (hybrid light-electron oscillations) can enable faster and more efficient optical data transmission on computer chips.
- Nanocomputers and nanorobots: Researchers are exploring the potential of nanoelectronic devices and nanoscale robots to create ultra-small, powerful computers and sensors.
- Quantum computing: Quantum computing using qubits based on electron spins can provide much faster processing speeds and the ability to handle larger data sets compared to traditional computing methods.
Overall, nanotechnology is enabling significant advances in computer hardware, data storage, communication, and even new computing paradigms like quantum computing that are crucial for continued improvements in computing power and efficiency. (Part 2)
-
Plasma scientists develop computer programs that could reduce the cost of microchips and stimulate American manufacturing: Fashioned from the same element found in sand and covered by intricate patterns, microchips power smartphones, augment appliances, and aid the operation of cars and airplanes. Now, scientists are developing computer simulation codes that will outperform current simulation techniques and aid the production of microchips using plasma, the electrically charged state of matter also used in fusion research. These codes could help increase the efficiency of the manufacturing process and potentially stimulate a renaissance of the chip industry in the United States. #ScienceDaily #Technology
Plasma scientists develop computer programs that could reduce the cost of microchips and stimulate American manufacturing
sciencedaily.com