Wolfram|Alpha: The Most Intuitive Experience for Working with Data Science and Neural Nets
Mark Braithwaite from Wolfram Research Europe speaking at Predict Conference


This article is an interview with Mark Braithwaite from Wolfram Research Europe. Mark uses the Wolfram Language daily as a development tool for company and customer applications. He is the author of several Wolfram U courses and gives talks and technical demonstrations on applications of the Wolfram Language in machine learning, AI and data science. Mark spoke at Predict Conference on October 1st 2019.


Mark, could you give us a quick intro to what Wolfram|Alpha does?

Wolfram|Alpha is our computational knowledge engine. Consider Google and Yahoo: if you put in a question, they give you a list of websites they think are most likely to answer it. With Wolfram|Alpha, if you ask a question it gives you an answer. Your question might be a complex equation or the weather in Rome. It will actually provide you with an answer, computed in most cases from our database of knowledge.

We also have a business version that works by being plugged into a company's data. So you might ask, "What did my colleague John do on Friday?" Wolfram|Alpha will then give you a list of all the documents he worked on that day, including highlighting the edits. People at CEO or CTO level can ask quite complex questions about financial predictions and the like, and actually get accurate answers.


What is Wolfram’s involvement in A.I. today?

Our story has always been about automation and integration of computation. That approach has not changed whatsoever. Continuing that storyline, we've approached A.I. from two different directions. The first is trying to make A.I. accessible to everybody, so we've focused on automating our library of neural networks. We have over 50 neural networks, all built by experts. But if you're just getting into A.I., that's quite an intimidating thing. So we've worked on automating access to those networks to the point where, if you just want to drag and drop an image from your favourite browser and classify it, you can. It should be really simple.

The other direction is if you are an expert, you're going to want to look at the tools and integrated workflows that we provide in order to try and make the manipulation and creation of neural networks much easier. Your workflow is quicker and simpler. Those two approaches, when combined with the integrated nature of the language, really do provide some interesting capabilities.

Our main involvement, then, has been trying to provide these expert-level capabilities, whether tools or neural networks. They are defined and devised by experts. We are trying to make them easier to access and use, so that whether you're just starting out or you're an expert, you should be able to achieve something useful in a few hours at most.



What exactly are the capabilities of the Wolfram Language and how do they differ from, say, Python with Pandas?

I mentioned that our capabilities stem from 50-plus expertly crafted neural networks. We developed a framework to support them so that they're easy to import into the language and use directly with one line of code. They can do things as simple as classifying an image or extracting features from complex datasets, and range from semantic segmentation of data all the way up to complex speech recognition.

We provide tools that allow experts to access layers of a neural network and customize them to exactly how they need. That might include the connecting up of layers or changing individual weightings.

It doesn't matter exactly how you're representing your data as long as you can provide it to the A.I. It should then be able to interpret your data semantically, or have a sensible way of converting your data into something it can use, normally via one of a variety of feature extraction networks.

If you wish to then combine, say, feature extraction with time series functionality and processing, you can. If you need complex analysis of huge datasets, it should be just a few lines of code. You can connect directly to the database, easily bringing only the data you need into memory rather than having to pull a huge amount of data and comb through it. We also have support for things such as correlation of data, to really find out whether there is any relation between your data points and what they actually mean for you.

Because of the integrated nature of our language, even things such as results or ongoing processes can be easily visualized, to help individuals understand what's going on and what they've actually achieved at the end of it.


Tell us about some projects you have worked on using Wolfram technologies.

We have a capsule colonoscopy project. The technology is a tiny capsule with a camera which records video as it passes through your digestive system. The idea is to use machine learning to identify areas of interest in the video. Footage might last six hours, so we cut it down to smaller ten-minute segments relevant to identifying cancerous cells.

Another project we have ongoing is predictive maintenance of wind farms, because the components for wind turbines are incredibly expensive and hard to store. We can predict maintenance needs from both the sensor data and images taken from a helicopter at a distance. The result of that particular project was a surprisingly accurate A.I. which provided a brilliant prediction of when maintenance was needed, to the point where they could send someone out before anything broke.
