2-Min AI Newsletter #15

Latest AI/ML Research Highlights

Meta AI Open Sources AITemplate (AIT), A Python Framework That Transforms Deep Neural Networks Into C++ Code To Accelerate Inference Services

Meta AI has created AITemplate (AIT), a unified open-source inference solution with distinct acceleration back ends for both AMD and NVIDIA GPUs. On a range of popular AI models, including convolutional neural networks, transformers, and diffusers, it delivers performance close to hardware-native Tensor Core (NVIDIA GPU) and Matrix Core (AMD GPU) utilization.
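For readers curious what the workflow looks like, here is a minimal sketch of the AITemplate-style flow: describe a model in a Python front end, compile it into fused GPU kernels, then run inference through the generated module. The module paths, function names, and attribute conventions below are assumptions drawn from the project's published examples and may differ; consult the AITemplate repository for the actual API.

# Illustrative AITemplate-style workflow (module/function names are assumptions).
from aitemplate.frontend import nn, Tensor      # assumed Python front end
from aitemplate.compiler import compile_model   # assumed compile entry point
from aitemplate.testing import detect_target    # assumed GPU-target helper

# Symbolic input; shapes and dtype are fixed at compile time.
x = Tensor(shape=[8, 512], dtype="float16", name="input0", is_input=True)

# A tiny two-layer perceptron expressed with the AIT front-end ops.
fc1 = nn.Linear(512, 1024)
fc2 = nn.Linear(1024, 10)
y = fc2(fc1(x))
y._attrs["is_output"] = True   # mark the graph output (assumed convention)
y._attrs["name"] = "output0"

# Compile the graph into a fused C++/CUDA (or HIP) shared library for the
# local GPU; the returned module can then be fed real tensors for inference.
module = compile_model(y, detect_target(), "./ait_build", "tiny_mlp")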

Google answers Meta’s video-generating AI with its own, dubbed Imagen Video

Imagen Video builds on Google’s Imagen, an image-generating system comparable to OpenAI’s DALL-E 2 and Stable Diffusion. Imagen is what’s known as a “diffusion” model, generating new data (e.g., videos) by learning how to “destroy” and “recover” many existing samples of data. As it’s fed the existing samples, the model gets better at recovering the data it had previously destroyed to create new works.
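To make the “destroy and recover” intuition concrete, here is a generic toy sketch of diffusion training in PyTorch: noise is progressively added to samples (the forward, “destroy” process), and a small network is trained to predict that noise so it can later be removed (the reverse, “recover” process). This is a minimal illustration of the general technique, not Imagen Video’s actual architecture, which operates on video frames with far larger models.

# Toy diffusion training loop: add noise to data, learn to predict the noise.
import torch
import torch.nn as nn

T = 1000                                    # number of noise steps
betas = torch.linspace(1e-4, 0.02, T)       # noise schedule
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

def add_noise(x0, t, noise):
    # "Destroy" step: x_t = sqrt(abar_t) * x_0 + sqrt(1 - abar_t) * eps
    abar = alphas_cumprod[t].view(-1, 1)
    return abar.sqrt() * x0 + (1.0 - abar).sqrt() * noise

# A toy denoiser over flat vectors; real video models use large 3D U-Nets.
denoiser = nn.Sequential(nn.Linear(64 + 1, 128), nn.ReLU(), nn.Linear(128, 64))
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

x0 = torch.randn(32, 64)                    # stand-in for real training data
for step in range(100):
    t = torch.randint(0, T, (x0.size(0),))
    noise = torch.randn_like(x0)
    xt = add_noise(x0, t, noise)
    # Condition on the timestep and learn to "recover" the injected noise.
    t_feat = (t.float() / T).unsqueeze(1)
    pred = denoiser(torch.cat([xt, t_feat], dim=1))
    loss = ((pred - noise) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()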

Amazon Open-Sources ‘MINTAKA,’ a Complex, Natural, and Multilingual Question-Answering (QA) Dataset Composed of 20,000 Question-Answer Pairs

Researchers at Amazon have released a dataset for complex and multilingual question answering. Mintaka is a sizable, complex, naturally occurring, and multilingual question-answer dataset: its 20,000 questions were collected in English and professionally translated into eight languages (Arabic, French, German, Hindi, Italian, Japanese, Portuguese, and Spanish). Entities in the question and answer text are linked to Wikidata IDs.
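As a rough illustration of how a question, its translations, and its Wikidata links might fit together in a record, here is a hypothetical Mintaka-style entry. The field names and values below are assumptions made purely for illustration; check the dataset’s repository for the actual schema.

# Hypothetical Mintaka-style record; field names and values are illustrative
# assumptions, not the dataset's actual schema.
import json

record = {
    "id": "example-0001",
    "question": "Which planet is the largest in the Solar System?",
    "translations": {"de": "Welcher Planet ist der groesste im Sonnensystem?"},
    "questionEntity": [{"name": "Q544", "label": "Solar System"}],   # Wikidata ID
    "answer": {"answerType": "entity",
               "entity": [{"name": "Q319", "label": "Jupiter"}]},    # Wikidata ID
    "complexityType": "superlative",
}
print(json.dumps(record, ensure_ascii=False, indent=2))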

Amide library created at speed with machine learning and stopped-flow chemistry

A team of scientists based in Sweden and the UK has developed a synthetic screening method that uses stopped-flow chemistry and machine learning to accelerate drug discovery through diversity-oriented synthesis.

AI-Generated Joe Rogan Chats Up Steve Jobs Over His Use of LSD, Spat With Gizmodo

Podcast.ai generated a fake audio recording using artificial voices and language model transcripts based on Rogan and Jobs’ old public speeches and keynotes.

MIT And IBM Researchers Present A New Technique That Enables Machine Learning Models To Continually Learn From New Data On Intelligent Edge Devices Using Only 256KB Of Memory

A new study by MIT and IBM researchers investigates how small-scale on-device training can be made practical through the co-design of algorithms and systems. Tiny on-device training faces two distinct hurdles, which the researchers address: models quantized for edge devices are difficult to optimize directly, and the limited memory and processing power of microcontrollers make full back-propagation infeasible.
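A generic way to picture the memory constraint is sparse updating: freeze most of the network and train only a small subset of parameters so that neither the optimizer state nor the retained activations exceed the device budget. The PyTorch sketch below illustrates that general idea only; it is not the MIT/IBM system itself, which pairs techniques for optimizing quantized graphs with a co-designed training runtime.

# Generic sketch of training under a tight memory budget: freeze most of the
# model and update only biases plus the final classifier head. This is an
# illustration of the general idea, not the MIT/IBM method itself.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 10),
)

# Freeze everything, then re-enable only biases and the classifier head,
# which shrinks both the gradient/optimizer state and the activations that
# must be kept around for back-propagation.
for p in model.parameters():
    p.requires_grad = False
for name, p in model.named_parameters():
    if name.endswith("bias") or name.startswith("6."):   # index 6 = Linear head
        p.requires_grad = True

trainable = [p for p in model.parameters() if p.requires_grad]
opt = torch.optim.SGD(trainable, lr=0.01)

x, y = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()   # gradients flow only to the small trainable subset
opt.step()
print(sum(p.numel() for p in trainable), "trainable parameters")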

Cool AI Papers For This Week

Bird-Eye Transformers for Text Generation Models

FLamby: Datasets and Benchmarks for Cross-Silo Federated Learning

NerfAcc: A General NeRF Acceleration Toolbox

Boosting Out-of-distribution Detection with Typical Features

Invertible Rescaling Network and Its Extensions
