Billionaire Battle in the Fluid Industry:

The arrival of the Synthetic model


Introduction

A billion-dollar battle is approaching! The scenario: the fluid industry. It involves innovation, cultural change, a new model, and team building.

Automation has changed the face of industry. Digital transformation (Industry 4.0) will drive the next big change. The fluid industry was left out of the first major change and is threatened with being excluded from the second as well. To avoid this, it needs to change the analytical decision-support model adopted so far.

The fluid industries invest hundreds of billions of dollars every year in research, development and innovation (R&D&I). A technology that can deliver digital transformation to these industries is a strong contender for these resources. The applications derived from it should become the winners of this battle for the sector's R&D&I investments.

Battle for hundreds of billions of dollars of fluid industry R&D&I.

This has happened before, in medicine, when analytical diagnostics (laboratory results) lost ground to imaging diagnostics (ultrasound, magnetic resonance imaging, computed tomography). We all know the impact of this novelty on our lives: it eliminated the need for many surgical interventions and invasive tests. Doctors who specialized in the new technologies upgraded their careers.

The battle of fluids will be similar. It will change costs, production speed and volume, and product quality. As in medicine, a gradual migration from the current scenario to the future one will be inevitable.

When we talk about fluids we are talking about a variety of industries: beverages, pharmaceuticals, cleaning, cosmetics, chemicals, fuels, oil & gas.

The synthetic model is a good candidate to be the logical-mathematical basis of this new technology. It uses advanced AI to handle fluid complexity and deliver speed, process knowledge and homogeneity of actions.



I will examine the case of the startup XMachina, whose technology follows the synthetic model, and its experience in the fluid industry from various perspectives: cultural difficulties, technological obstacles, market geography. We will see how the AI it uses meets the requirements of the synthetic model.

The goal of adopting the synthetic model is to eliminate bottlenecks in production processes such as reprocessing, asset wait time for lab analyses, product subclassification, losses, and wide specification margins.

Analytical Model

The fluid industry is still as dependent on quality laboratories as it was in the first half of the 20th century.

To understand this dependence on the quality laboratory, we must look at two aspects that have hindered the introduction of new automation technologies:

Submission to the analytical model - several characteristics are analyzed (not all of them) and then decisions are made (human or automatic - in the case of closed loops);

Fluid complexity - some involve hundreds of characteristics to be monitored (such as drinking water delivered to the population).

Over the last eighty years, the largest technology companies have not managed to advance the automation of this industry's processes, and the main cause is the complexity involved. The further we move from pure fluids toward compounds and mixtures, the more the complexity increases.

The obstacle on the way: combination of continuous series.

What, after all, is the difficulty in automating the fluid industries? A fluid, in general, is a compound. 100% pure products are the result of high-cost processes, in most cases on a small scale. Each characteristic of a fluid varies continuously from a minimum to a maximum. Within this interval, the number of distinguishable occurrences depends on the accuracy you adopt. As accuracy can increase indefinitely, we have, for practical purposes, a continuous series. With many characteristics involved, we arrive at a combinatorics of several continuous series.

If we want to control the presence of chlorine in water between 0 and 1%, with an accuracy of 0.1% we have ten possible occurrences; if we increase the accuracy to 0.01%, we have one hundred occurrences; and so on.

The number of intervals to monitor increases a lot when we increase accuracy!
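The growth described above can be made concrete with a small back-of-the-envelope sketch. The function names and the ten-characteristic example are illustrative, not taken from any real plant:

```python
def interval_count(lo, hi, accuracy):
    """How many distinguishable intervals one characteristic yields
    at a given measurement accuracy."""
    return round((hi - lo) / accuracy)

def state_space(n_characteristics, intervals_each):
    """Size of the combined state space: intervals multiply across
    characteristics -- the 'combinatorics of continuous series'."""
    return intervals_each ** n_characteristics

# Chlorine between 0% and 1%:
print(interval_count(0.0, 1.0, 0.1))    # 10 intervals
print(interval_count(0.0, 1.0, 0.01))   # 100 intervals

# Ten characteristics at 100 intervals each:
print(state_space(10, 100))             # 10**20 combinations
```

Multiplying rather than adding is what makes brute-force monitoring of every characteristic infeasible.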

Fluid quality depends on available technology! The more precise the technology, the narrower my specification margin and the higher the quality of my product. This precision, however, cannot greatly increase the analysis time: beyond a certain point, trading more analysis time for more accuracy nullifies the benefits.

If we multiply the difficulty of controlling one characteristic by the tens or hundreds of them required by some fluids, we can visualize the magnitude of the problem. The traditional solution is to reduce the characteristics to the set most significant for the market and disregard the others.

The market defines the margin of each plant: if the product is well received by 90% of potential consumers, success is guaranteed. The costs to reach the other 10% may be prohibitive. The solution may be to launch premium products by controlling these other characteristics, with much higher price and profit margin. Or leave these niches to specialized industries.

Line analyzers and equivalent instruments, such as NIRs, took monitoring to the process lines. Dynamic sensors have advanced a lot: thermostats, pressure gauges, flow meters (including ultrasonic), colorimeters, turbidity meters, conductivity meters. Why have these advances not solved the problem? Fluids are complex, with many characteristics, and there are no instruments capable of monitoring all of them in real time.

In recent decades, laboratory instruments have advanced incrementally, becoming faster and more accurate. But the dependence on quality laboratories continues.

This dependence comes from two complicating factors: waiting time and decision-making.

Important consequences of the Wait Time complicating factor:



Dead times - in the different phases of a batch, after sample collection we have to wait for laboratory results to release the following phases. This reduces asset productivity, sometimes with wasted energy. When a phase is controlled by a fixed time, the safety margin implies the loss of precious minutes that, multiplied across the hundreds or thousands of batches per year, add up to significant losses;

Wide specification margin - we need to extend the specification margins to adjust them to the time of laboratory analysis, with consequent reduction in the quality of the products;

Excessive use of inputs - we need to dose more inputs to ensure complete reactions. This generates, in addition to the cost, a greater amount of waste in the process and problems with the preservation of the environment;

Delayed actions - acting on analyses from 30, 40 or 50 minutes ago means acting on a past fluid, which may no longer exist, implying product losses, subclassification and/or reprocessing.

Important consequences of the Decision-making complicating factor:

The decision-making method is tied to the analytical model: it is necessary to gather several analyses together with the online data and decide what to do in each situation. The combinatorics can be vast. Two solutions are available today: panel operators and automated algorithms.

Decision-making by operators rests on two factors that are not always in harmony: training and experience.

People make decisions when they overcome their doubts and achieve certainty. Different operators sometimes take different actions in the face of the same situation.

Consequences derived from Decision-making by people:

Different actions for equivalent situations - direct consequence of different experiences and capabilities. Speed and strategy to reach a decision varies from operator to operator;

Different energy cost for equivalent situations - energy consumption depends on the modus operandi of the process controllers. Different ways of operating imply different energy expenditures, and detecting these differences is not always easy for managers;

Different risk minimization actions - different operators tend to prioritize diverse actions in complex situations that involve claims risks;

Different prioritization of adjustments - when more than one characteristic is out of specification, the necessary adjustments must be prioritized, and different operators may set different priorities.

It is worth emphasizing how difficult it is for managers to define what equivalent situations are. They are not present throughout the operation, and the complexity of each situation allows several readings, all of them justifiable depending on the perspective adopted.

Decision trees are the AI tool most widely used for this kind of automation. They internalize human knowledge, and they are only as good as the experts who helped build them.

Some consequences of Decision-making by algorithms:

Crystallization of the operation - automation will be built with the existing knowledge at the time of its development. New knowledge will require changes that imply new R&D;

Greater risks in unprecedented situations - situations that have never happened before may, by their very novelty, lead to wrong decisions or to no decision at all, because they are not provided for in the model;

More costly optimizations - operation optimizations may require complementary R&D projects, which, in general, entail high costs;

Reduction of learning - the use of algorithms tends in the medium and long term to promote a decrease in the expertise of the operation due to the withdrawal of operators from decision-making.

Synthetic Model

When a new technology arrives that allows higher quality of products with the same or lower costs, market and survival may be threatened.


The moment of a disruptive innovation has come: the analytical model, which we analyzed above, can be replaced, with advantages, by the synthetic model. It moves from a decision chain such as:

IoT Data —> Data analysis —> Taking action,

For a decision chain of the type:

IoT data —> Synthetic diagnosis —> Decision making.

What is the synthetic model after all?

It needs to arrive at diagnoses eliminating human decision-making and providing the following characteristics:

Equal actions in equivalent situations - replace the complex analysis process with immediate decision-making from primary data. It must, through real-time diagnosis, allow immediate corrections that prevent problems from gaining complexity, regularizing the process as a whole rather than specific aspects of it;

Equal energy cost for equivalent situations - support the inclusion of energy cost as a variable of the process to be controlled and minimize it;

Minimization of risks - be predictive and take actions that remove the risk of claims, even unprecedented ones;

Systemic actions - eliminate hierarchization, from a systemic and synthetic view of the process.

It should replace knowledge-based algorithm models with adaptive artificial intelligence and adopt the following characteristics:

Continuous improvement - adapting to the state of each plant and accompanying the increase in product quality;

Reduction of unprecedented risks - be predictive, preventing unprecedented situations from developing through immediate correction of departures from requirements;

No-cost optimizations - dispense with optimization R&D, since the system optimizes itself;

Knowledge generation - provide operation and quality teams with knowledge about process and materials under their management.

These requirements can be used for any automation project in the industry, not just fluids.

I said above that the fluid industry has faced a considerable automation challenge. Wouldn't it be utopian, then, to overcome this delay and advance toward digital transformation at the same time? The answer is to eliminate the main obstacle: the complexity of fluids and what I called the combinatorics of continuous series.

The combinatorics of continuous series is directly linked to the analytical model. This model understands that the way to produce quality is to control each variable involved. The synthetic model understands, in contrast, that it is to control the whole.

We need a technology that works like our brain. To be more exact, like that of a police sniffer dog, the kind that sniffs out drugs at airports. The dog's learning takes place in the presence of the drug. The dog knows nothing about formulas but understands what cocaine is, for example; it learned in contact with the substance. We want to build solutions that decide whether or not the fluid is good as a whole.

Imagine you are in front of two orange soft drinks. To find out which is best, you can read the formula on the label (analytical method) or taste both (synthetic method). When you want to know whether a soft drink is good, which method do you use?

This can be our guiding problem, the one that serves as a beacon for what we want to achieve. For a technology to fit the synthetic model, some requirements must be observed:

Operate from primary data - receive the set of raw signals from the various sources and process them as a whole and not one at a time and thus circumvent the complexity of the fluids;

Do without models - be independent of metrics and of interpretations built on them. The goal is to achieve a desired standard at each stage and continuously correct the course of the process in order to obtain the best final product;

Being universal - being able to treat various fluids without having to be adapted to each one. After all, we can both choose the best orange soft drink, as well as the best espresso and the sniffer dog can identify both marijuana and cocaine;

Allow verticalization - allow building vertical series of engines with the output of some being the input of others;

Possess predictability - needs to identify new situations in the same way as those already known;

Apprehend singularities - identify each fluid in its singular characteristics. Understand, for example, in a blind test, that we are faced with two wines, but that one is different from the other;

Build abstractions - group patterns into sets and recognize them as a group. We are able, for example, to distinguish whether we are facing a wine or a grape juice.

With a technology capable of operating on primary data, we do not need to teach it what we know about that data: it will be able to learn by itself. It will be universal. We can deliver the output of one system to another, vertically. And if it can operate without models, it can anticipate unprecedented situations.

An industry operates with available technologies. Given the advantages presented by the synthetic model over the analytical model, the fact that the industry continues to operate with the analytical model may indicate that this technology did not yet exist. The good news is that it already exists and has been put into action in some important industries.

Next, I will present the first artificial intelligence technology platform I know of that operates under the synthetic model, and examine the roadmap of our experience in the market.

XMACHINA'S EXPERIENCE IN DEPLOYING INTELLIGENT INDUSTRY 4.0 IN THE FLUID INDUSTRY



In 2008-9, I elaborated, based on a reductionist logical-mathematical model of how brains work, an unprecedented neural network built on elementary particles I called Neurologs. I consider it the first operational non-Turing platform in existence. This network has an important difference from previous ones: it replaces learning with adaptability. Learning implies the assimilation of previous knowledge; adaptability implies the ability to adjust to any new knowledge. This makes it possible to use a single apparatus of this kind for different purposes: the sniffer dog's nose, for example, can identify different drugs, just as our smell and taste can differentiate different drinks.

Faced with any set of signals, this network adapts and is able to issue alerts indicating the distance between what it is seeing and what it has adapted to. In other words, it meets some of the requirements listed above for a synthetic automation model. It is our sniffer dog.
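As a thought experiment, this "adapt, then alert on distance" behavior can be sketched in a few lines. The toy monitor below is purely illustrative: the actual Neurolog network is not public, so an exponentially adapted mean profile with a Euclidean distance stands in for whatever the real mechanism is.

```python
class AdaptiveMonitor:
    """Toy stand-in for an adaptive network: it tracks a running
    profile of the stimuli it has seen and reports how far each
    new observation sits from that profile."""

    def __init__(self, n_stimuli, rate=0.05):
        self.profile = [0.0] * n_stimuli
        self.rate = rate  # how quickly the profile adapts

    def observe(self, stimuli):
        # Distance from the adapted profile (Euclidean).
        dist = sum((s - p) ** 2
                   for s, p in zip(stimuli, self.profile)) ** 0.5
        # Nudge the profile toward the new observation.
        self.profile = [p + self.rate * (s - p)
                        for s, p in zip(stimuli, self.profile)]
        return dist

m = AdaptiveMonitor(3)
for _ in range(200):
    m.observe([1.0, 2.0, 3.0])      # the fluid it adapts to
alert = m.observe([1.0, 2.0, 9.0])  # an anomalous reading
print(alert > 1.0)                  # large distance -> alert
```

The key property mirrored here is that nothing fluid-specific is hard-coded: the same monitor adapts to whatever signal mix it is fed.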

How do we achieve this universal characteristic? We create a normalization layer that transforms any signal into an abstraction we call a stimulus. That is, regardless of whether we are receiving the signal from a thermostat, a flow meter or a line analyzer, on passing through this layer the signals lose their quantities and keep only their qualitative differences. From then on, it is impossible to determine which is which.

Why did we do it this way? Because that is how things happen in brains! It is impossible to distinguish whether a particular stimulus came from the eyes, the ears or the taste buds.
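One way to picture such a normalization layer, purely as a hypothetical stand-in (the article does not disclose the actual transformation), is a per-sensor z-score: each signal is stripped of its units and magnitude, keeping only how unusual it is relative to that sensor's own history.

```python
import statistics

def to_stimulus(signal, history):
    """Map a raw reading to a dimensionless 'stimulus': how far it
    sits from this sensor's own typical behavior. After this step
    a thermostat and a flow meter are no longer distinguishable."""
    mean = statistics.mean(history)
    spread = statistics.pstdev(history) or 1.0  # guard flat history
    return (signal - mean) / spread

temps = [70.1, 70.3, 69.9, 70.0]   # degrees C, thermostat
flows = [12.0, 12.4, 11.8, 12.1]   # m3/h, flow meter

# Both outputs are unitless deviations -- "which is which" is gone:
print(to_stimulus(71.5, temps), to_stimulus(12.2, flows))
```

After normalization, downstream processing sees only abstract deviations, which is what makes a single engine applicable to any sensor mix.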

A network of this nature can be applied to any system, be it credit analysis, stock portfolios, health, or predictive maintenance of equipment. This technology was the basis for founding an AI startup called XMachina.


We put together a handful of Neurolog neural networks, their normalization layers, a dashboard with various functionalities and a data API, and built a learning engine that we call ARIN (an acronym for Artificial Intelligence).


The first industry we chose to serve was fluids. It is underserved because of the complexity of fluids, and we had a disruptive technology, excellent at treating complexity. It is very difficult for a startup to enter the operational side of industries. But we had an AI solution no one else had, and an industry that lacked automation.

Anyone who has created a startup knows a childhood disease of innovation: arrogance. We believed that an applicable disruptive technology was enough and that we would win the market. We disregarded important aspects such as a culture accustomed to the analytical method, the specificity of the Brazilian market as a buyer of technology in need of R&D, and final products already well received by customers at current quality levels. Above all, we underestimated the fact that large technology companies had been in the automation market for years without managing to solve this problem. We would pay a price for this arrogance.

We had AI and the Market, we needed to choose the IoTs we would use to make our Go-to-Market.

Inspired by medicine and its migration to imaging diagnostics, we created a scanning IoT device based on light (refraction and diffraction). When light passes through a fluid, it registers the fluid's identity through the changes it undergoes in its propagation. The TOR (acronym for Optical Refractive Transducer) is an assembly comprising a transparent tube through which the fluid passes, an LED light emitter (white, infrared and ultraviolet), an image sensor operating at 30 frames per second, and a mini CPU that preprocesses the image. This preprocessed data is delivered to ARIN.

Our first solutions were based on it. At the beginning of the pandemic, however, prompted by a partner, a large technology multinational, we ran a Proof of Concept (POC) to differentiate Covid-19 coughs from non-Covid-19 coughs.

In this POC we demonstrated all the versatility of our AI: in fifteen days we migrated successfully from video to audio, reaching 80% accuracy.

At that time, we could connect both image and sound IoT devices to ARIN. We moved forward on two fronts:

A POC, successfully completed at a mining company, to plot the ore pulp density curve (initial accuracy of 0.5 g/cm3, later 0.1 g/cm3). In this case we could not use the TOR because the line was a 5-inch pipe with a polymer core, so we chose a microphone and two accelerometers. Once again the versatility of our AI was proven: we integrated the accelerometers in about two weeks. It was our first operational experience with off-the-shelf IoT devices;

A POC in our laboratory to differentiate the voices of different people, and altered states of the same person - we successfully completed this proof of concept in three weeks.

Validation

Our technology can tell whether a fluid is the same as, or different from, the one to which our network has adapted, and how far it is from it. That is, we can give a `Go - No Go` to process engineering, which is very valuable to them. Our first value proposition, however, was to replace laboratory analysis with ARIN diagnostics.

First misconception: instead of `Go - No Go`, we proposed a `No Go - Why? `.

In addition, our initial proposal left the whole process in our care, which ended up including the entire digital transformation: training, consulting, engineering. Instead of delivering an advanced AI platform and empowering the customer's team to use it, we set out to develop the entire solution for the customer. This involved transferring process technology to our team, consulting for their team, and R&D. At the ticket Brazilian industries are willing to pay for a startup innovation project, this meant running projects at a loss.

Second misconception: do R&D for the customer with the ticket available for startups.

Another misconception was to assume that all aspects (characteristics) of fluids are controlled on a production line. They are not! Fluids are complex. The characteristics an industry needs to control are only a subset of the total. There is a kind of agreement between an industry and its market that determines which characteristics must be specified so that a given fluid meets the requirements of a subsequent industrial process or the taste of the final consumer. Our technology, however, following the synthetic model, reports deviations in all characteristics. Even the unspecified ones!

This usually works well with high-purity fluids, but it can generate significant problems with others: for example, the detection of variations outside the required specification list, which are of no interest either to our customers or to the market they serve. In some customers, this called into question the value we wanted to deliver: releasing processes without depending on quality-laboratory analyses.

Let me detail this. Suppose a process includes a batch reactor; that at the end of each batch the process requires a sample collection followed by laboratory analysis, which together take an average of 50 minutes of waiting; and that only about 2 to 3% of the analyses indicate a specification problem. A real-time system that indicates GO with 100% accuracy would then greatly reduce reactor waiting time, collection and analysis costs and, eventually, energy expenses. To be exact, 97 to 98% fewer analyses.
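The arithmetic behind that claim is worth making explicit. Only the 50-minute wait and the 2-3% off-spec rate come from the scenario above; the annual batch count is an assumed round number for illustration:

```python
batches_per_year = 2000      # assumed for illustration
wait_minutes = 50            # collection + lab analysis per batch
off_spec_rate = 0.025        # 2-3% of analyses flag a problem

# A real-time GO releases every in-spec batch immediately:
released = batches_per_year * (1 - off_spec_rate)
analyses_avoided = released / batches_per_year
hours_saved = released * wait_minutes / 60

print(f"{analyses_avoided:.1%} of lab analyses avoided")    # 97.5%
print(f"{hours_saved:.0f} reactor-hours recovered/year")    # 1625
```

Even with modest batch counts, the recovered reactor-hours dominate the value case.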

The problem is that the further we move from pure fluids toward mixtures, the more we find characteristics that alter the refraction and/or diffraction of the fluid but do not matter commercially. In this case we will have detected real variations in the monitored fluid that, in view of the product specifications, will be treated as false positives. The share of batches released will not be, say, 97 to 98%, but something smaller. It becomes necessary to evangelize, to convince customers that this lower performance can still turn into competitive advantages, and that even if the maximum reduction in analyses is not achieved, a significant reduction will be.

Third misconception: to imagine that controlling the entire complexity of a fluid is a net and certain value for the industry.

You may think: in compensation, knowledge was generated about the process that did not exist before. This knowledge can be used to deliver a higher-quality or even premium product to the market, with higher added value. This is accurate, but it runs into an obstacle already mentioned: our industry, with rare exceptions, lacks a culture of R&D and innovation. If our startup operated in central countries, this value would perhaps be seen as an important competitive advantage.

In addition, these false positives can prevent very serious situations involving major claims. I will address two well-known emblematic cases and a real one from one of our customers:

Backer case - a leak of coolant into the beer being produced caused harm and deaths among customers and the closure of the factory for an extended period. Note that Backer's beer was within specification, so they did not put a bad product on the market relative to those specifications; the leak was simply not foreseeable. If our solution had been deployed at Backer, it would have detected the problem, and all the damage to people and to the company itself would have been avoided;

Cedae case - a bad smell in the delivered water generated fear and insecurity in the population, and it took weeks for the problem to be solved. Again, if our solution had been applied throughout the water-treatment process, the problem would have been detected at its origin, its cause immediately known and resolved, avoiding prolonged discomfort and damage to the company's image;

Separation column case - at one of our customers, ARIN pointed out a deviation in the quality of the product leaving a separation column. All standard tests indicated a within-specification fluid. The quality laboratory decided to run additional tests with a chromatograph. The result showed the presence of a substance (in this case paraffin) not expected at that stage of the process. Once detected, the problem was fixed quickly.

Our technology's ability to avoid claims can make a real difference, preventing loss of life, illness, and both material and intangible damage.

A super-specified fluid can be seen as a problem for meeting production goals, but in the medium and long term it will be recognized in the market, it can get a better price, enhance the brand, increase its share and profit margins.

Industries have dedicated areas to resolve failures when their products are used in specific applications by other industries. Such failures can be caused by these `ghost` deviations. A super-specified fluid can eventually make that department dispensable. This, however, is something that will only show up in future budgets.

Our biggest misconception was to try to fit a synthetic-model solution into an analytical culture. We can link it to the childhood arrogance that afflicts startups.

Pivoting

It was our turning point: we needed to increase the reach of our technology and pivot our value delivery.

To increase the reach of technology, we have adopted some strategies:

Specification ranges - consists of creating patterns for consecutive intervals of a certain characteristic. Often, in fluids with a significant degree of purity, we achieve complete replacement of the quality laboratory. An interesting example is the calibration that we perform in the TORs: successive contaminations of demineralized water with 10% chlorinated solutions, identified with an accuracy of 0.01%;

Multiple lights (white, infrared and ultraviolet) - combining lights with three TORs to separate various refraction and diffraction effects and to eliminate overlap between two or more characteristics;

Digital signatures - association of a certain characteristic with specific light frequencies. This development is based on the creation of digital signatures for voice that we use in the Covid-19 POC and on the differentiation of voices and their states.
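The first of these strategies, specification ranges, amounts to binning a continuous characteristic into consecutive intervals and learning one pattern per bin. A minimal sketch, with a hypothetical helper (the function and its arguments are illustrative, not XMachina's implementation):

```python
def spec_range(value, lo, hi, step):
    """Index of the consecutive specification range a measurement
    falls into, or None when it lies outside the monitored span."""
    if not (lo <= value < hi):
        return None  # outside all monitored ranges
    return int((value - lo) / step)  # floor to the enclosing bin

# Chlorine monitored between 0% and 1% at 0.01% accuracy:
print(spec_range(0.034, 0.0, 1.0, 0.01))   # bin 3 -> pattern #3
print(spec_range(1.25, 0.0, 1.0, 0.01))    # None -> out of span
```

Each bin index then selects the adapted pattern against which the fluid is compared, which is how a continuous series becomes a finite, learnable set.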

Of course, a solution that completely replaces the quality laboratories of any industry would have a gigantic scalability.

The digital signature, in particular, can expand the range of values we offer our customers by adding analysis to synthetic diagnostics.

While we wait for these developments, we have restricted our `No Go - Why?` value offering to high-purity fluids. We pivoted to a business model in tune with the needs of the industry. We continue to offer consulting and engineering, but as additional services. Our main offer is our AI platform.

We also pivoted the values to be delivered, based on our experience with our initial customers (Deten; Ipiranga; Elekeiroz; Oxiteno; Suzano; Ocyan; Anglo American):

Real-time fluid visualization - exchanges ideal process models for real-time monitoring in our dashboard;

Virtual R&D team in the process line - allows Operation and Quality teams to take innovative actions and monitor real-time results, verifying advances in relation to the current situation;

Quick addition of intelligence to existing systems - adds AI to primary data from existing IoTs eliminating time-consuming engineering and IT projects;

Input quality control - allows monitoring the quality of inputs received, allowing prompt alerts for the identification of unprecedented situations or others already identified before;

Reaction time control - allows clock-timed reaction and sanitization phases to be replaced by real-time monitoring that detects their actual endpoints. This eliminates dead times and gains precious minutes in each phase of the process;

Real-time synchronization analytics - identifies synchronies of events at different points in the process, identifying probable causes of chronic problems;

Predictability - Issues alerts, regardless of the specification, every time monitoring indicates departure from the specified standard, avoiding serious problems with products or claims;

R&D of inputs - allows input quantities and injection times to be varied while the reaction's performance is monitored in real time, enabling quality improvements and reducing the waste generated by overdosing.

We also moved toward selling products instead of projects. We launched our first two off-the-shelf products:


Reactor Soul - allows real-time control of reactors, saving time in reaction phases, making better use of inputs, and reducing dependence on sensors and laboratory analysis. In addition, it helps teams increase product quality by narrowing compliance margins. There are 22,000 reactors in Brazil alone, all presenting the same opportunities. The product includes a virtual R&D team that makes it possible to create new premium products on the process line;


Distillation Soul - allows separation columns to produce purer fluids. Each chemical plant has between three and six of these columns. Often it is not only a matter of increasing the quality of the final product, but of preventing a noble, more valuable fluid from mixing with others or with waste.

Conclusion

Replacing the analytical model with the synthetic model promises to erase the fluid industry's lag in automation and take it straight to the era of digital transformation. It will promote a small revolution in production methods, production costs, product quality and environmental protection.

The synthetic model is a good candidate to win the battle for the hundreds of billions of dollars the fluid industry invests every year in research, development and innovation. Once it has proven its full potential and managed to deliver digital transformation to this industry, it will rapidly become hegemonic in solutions both for new plants and for upgrading current ones.

In essence, it consists of eliminating action based on multiple fragmented pieces of information generated by a considerable number of IoT devices and laboratory analyses. In their place it puts scanning IoT devices that use light or ultrasound to pass through a fluid column and prospect for anomalies, integrating with the existing IoT park. It uses advanced AI to address the complexity involved and deliver speed, process knowledge and homogeneity of actions.

The case of XMachina and its experience in the fluid industry, examined here, revealed several early aspects of this battle: cultural difficulties, technological and geographical obstacles. We have seen how the AI it uses meets the requirements of the synthetic model, and how the startup built workarounds for the obstacles encountered.

Industries using the synthetic model with intelligent automation can eliminate bottlenecks in production processes such as: reprocessing; waiting time for assets by laboratory analysis; sub-classification of products; and losses.

Those industries that arrive first in this race will gain market share, delivering purer fluids and a new generation of premium products, which cannot even be imagined with current technologies. For the general population, it will mean more reliable products, fewer illnesses and deaths and a better preserved environment.
