Practical notes on using generative AI in the industry

[Cover image: Picture generated by Generative AI about Generative AI]

In recent times, Generative AI has made the biggest splash in the tech market. Many organizations are trying to envision future solutions in light of the capabilities of Generative AI. I am going to start a series of blogs discussing the practical aspects of using Generative AI in industrial solutions.

The topics will mainly revolve around B2B customers, with occasional exceptions along the way. My primary focus will be textual Generative AI, but I will discuss other forms wherever needed.

Scalability: The decision driver

In industry, the best solution is one that can be deployed in multiple instances with little or no configuration, because only then does the revenue surpass the cost of developing and maintaining the solution. Generative AI has some big issues in this respect. Let us first identify the different development and maintenance expenses of a solution based on Generative AI.

  1. Training cost: the cost you have to bear to enable the model to learn the concepts you want it to learn, such as the internal workings of specialized equipment.
  2. Deployment cost: if the solution is going to be deployed on-premises, there is a significant upfront cost for hardware powerful enough to run a model with billions of parameters.
  3. Inference cost: if you rely on web-hosted generative models, you will pay a perpetual cost to the model provider. If you use an on-premises model, you still pay this cost, only in a different form.
  4. Development cost: the typical development cost of a software solution, including human resources, web services, and similar items.
  5. Soft maintenance cost: the maintenance of the solution itself, for example, bug fixes and updates.
  6. Hard maintenance cost: the maintenance of the infrastructure if you opt for on-premises deployment. You can count this as a replacement for the inference cost (point 3).

No matter how high the cost is, a business is willing to pay it as long as the solution pays for itself. The problem with solutions developed using LLMs is that their recurring expenses can greatly exceed the initial estimates.
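To make this concrete, here is a minimal back-of-the-envelope sketch of how the recurring items above can come to dominate the one-time build cost over a few years of operation. All figures are hypothetical placeholders, not estimates from any real project.

    # Illustrative cost model only; every number below is a made-up placeholder.

    def total_cost(years: int,
                   development: float = 200_000,        # one-time build cost
                   deployment: float = 50_000,          # one-time on-prem hardware
                   inference_per_year: float = 60_000,  # hosted-model or compute fees
                   maintenance_per_year: float = 40_000 # retraining, prompt and bug fixes
                   ) -> float:
        """Total cost of ownership after `years` of operation."""
        one_time = development + deployment
        recurring = (inference_per_year + maintenance_per_year) * years
        return one_time + recurring

    if __name__ == "__main__":
        one_time = 200_000 + 50_000
        for years in (1, 3, 5):
            cost = total_cost(years)
            print(f"{years} year(s): total ${cost:,.0f}, "
                  f"recurring share {(cost - one_time) / cost:.0%}")

Even with these invented numbers, the recurring share climbs steadily, which is exactly where the initial estimates tend to be off.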

For example, a chatbot that searches the documentation of a piece of equipment and provides guidance on operating it can become completely useless if there are major changes in the equipment that are not yet reflected in the documentation. Even if the changes are reflected in the documentation, reprogramming may be needed to adequately cover all aspects of it. If the chatbot is assisting the whole plant, then a change to any piece of equipment can cause serious issues.

Challenges to scalability

A software business works best when there is a one-time construction cost and the product can then be sold to multiple buyers. Maintenance has its own cost, but a lapse in maintenance should not stop the software from working altogether. With solutions relying on LLMs, however, a lack of maintenance can bring the system to a grinding halt.

In B2B solutions, most enterprises use custom hardware and software combinations in their plants. Consider the solution mentioned before, which assists operators in handling the machines in the plant by searching the documentation: it must keep working even if one machine is replaced or a software version changes.

If you are using a fine-tuned model for the plant, then you face the following challenges:

  • Each time the documentation of a software or hardware component changes, your model has to be retrained.
  • If you are using Retrieval Augmentation, then you need to add the updated document to the document store (a minimal sketch of this step follows the list). If you are getting data from multiple sources, then you might also need to adjust at least some of the prompts in your system.
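As a rough illustration of the retrieval-side maintenance work, the sketch below re-indexes a manual into a document store only when its content has actually changed. The `embed` function and the in-memory store are hypothetical stand-ins for whatever embedding model and vector database a real system would use.

    import hashlib
    from dataclasses import dataclass, field

    def embed(text: str) -> list[float]:
        """Hypothetical stand-in for a real embedding model."""
        # A real system would call an embedding model here.
        return [float(b) for b in hashlib.sha256(text.encode()).digest()[:8]]

    @dataclass
    class DocumentStore:
        chunks: dict[str, list[tuple[str, list[float]]]] = field(default_factory=dict)
        fingerprints: dict[str, str] = field(default_factory=dict)

        def upsert_manual(self, doc_id: str, text: str, chunk_size: int = 500) -> bool:
            """Re-index a manual only if its content actually changed."""
            fingerprint = hashlib.sha256(text.encode()).hexdigest()
            if self.fingerprints.get(doc_id) == fingerprint:
                return False  # unchanged, nothing to do
            pieces = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
            self.chunks[doc_id] = [(p, embed(p)) for p in pieces]
            self.fingerprints[doc_id] = fingerprint
            return True

    store = DocumentStore()
    store.upsert_manual("pump-42-manual", "Operating instructions for pump 42 ...")

Every documentation change triggers this re-indexing step, and when data comes from multiple sources, prompt adjustments usually come on top of it.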

The biggest problem is that the solution you have developed for one plant may need significant changes if you want to deploy a similar solution in another plant. The following are the challenges in porting a solution to another plant:

  • You will likely have to revisit the parsing pipeline, because other plants or OEMs simply use different equipment or a different vendor, so their documentation may be stored in different formats (see the sketch after this list).
  • The hardest part of parsing the documents is handling the tables in the documentation. This is a challenging issue, and I am planning to write a separate blog about it.
  • If you are using a fine-tuned model, it will have to be fine-tuned again for the new plant.
  • If you are using Retrieval Augmentation, then you will have to adjust the code and prompts for the new scenario.
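To illustrate why the parsing pipeline tends to be plant-specific, here is a minimal sketch of a format-dispatching parser registry. The formats and parser functions are hypothetical; the point is that every new vendor or documentation format means another parser to write and maintain, with tables being the hardest part.

    from pathlib import Path
    from typing import Callable

    # Registry mapping file extensions to parser functions.
    PARSERS: dict[str, Callable[[Path], str]] = {}

    def register(extension: str):
        """Decorator to register a parser for a documentation format."""
        def wrapper(func: Callable[[Path], str]) -> Callable[[Path], str]:
            PARSERS[extension] = func
            return func
        return wrapper

    @register(".txt")
    def parse_plain_text(path: Path) -> str:
        return path.read_text(encoding="utf-8")

    @register(".csv")
    def parse_csv_table(path: Path) -> str:
        # Flatten each row into a line the retriever can index.
        rows = path.read_text(encoding="utf-8").splitlines()
        return "\n".join(row.replace(",", " | ") for row in rows)

    def parse_document(path: Path) -> str:
        parser = PARSERS.get(path.suffix.lower())
        if parser is None:
            # A new vendor format means writing and registering another parser.
            raise ValueError(f"No parser registered for {path.suffix!r}")
        return parser(path)

In practice, vendor-specific PDFs and scanned manuals with embedded tables are where most of this effort goes.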

These challenges make the solution hard to reuse, and the maintenance costs eat into the profit margin. In some cases, a solution already deployed in one plant may simply be unusable in another plant. This not only decreases your profit margin but also hurts your brand.

Choose profitable grounds

The challenges mentioned above should not lead to the conclusion that there is no scope for LLMs in B2B business. If a solution is not strictly dependent on changing sources of information, then the system can perform well. The following are some hints that can help you choose a problem suitable for LLMs.

  • Chatbots and other conversational tools that rely entirely on documentation and machine-specific information are going to be a challenge.
  • Instead of conversational tools, use LLMs to generate code for the user. Good prompting, combined with constraining the LLM to the problem domain, can make the system robust and repeatable (see the sketch after this list).
  • Use LLMs to assist your work rather than the client's work. For example, if you are generating reports for clients, then use LLMs to generate those reports, suggest new types of reports, and propose new presentation methods.
  • Use LLMs for specific users. For example, if a client wants to do root cause analysis, then instead of creating a tool that analyzes all log files, create a tool that assists the maintenance engineers in their work.
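As a rough sketch of what constraining an LLM to the problem domain can look like, the example below asks a model for a small, machine-checkable command sequence and validates it against an allow-list before anything is executed. The `call_llm` function and the command names are hypothetical placeholders for whatever model API and plant-domain actions you actually have.

    import json

    # Only these plant-domain commands may ever reach the execution layer.
    ALLOWED_COMMANDS = {"read_sensor", "plot_trend", "export_report"}

    PROMPT_TEMPLATE = (
        "You are an assistant for plant operators. Respond ONLY with a JSON list "
        "of steps, each {{\"command\": <one of {allowed}>, \"args\": {{...}}}}.\n"
        "Task: {task}"
    )

    def call_llm(prompt: str) -> str:
        """Placeholder for a real model call; returns a canned response here."""
        return '[{"command": "read_sensor", "args": {"tag": "TT-101"}}]'

    def generate_plan(task: str) -> list[dict]:
        prompt = PROMPT_TEMPLATE.format(allowed=sorted(ALLOWED_COMMANDS), task=task)
        raw = call_llm(prompt)
        try:
            steps = json.loads(raw)
        except json.JSONDecodeError as exc:
            raise ValueError("Model did not return valid JSON") from exc
        for step in steps:
            if step.get("command") not in ALLOWED_COMMANDS:
                raise ValueError(f"Disallowed command: {step.get('command')!r}")
        return steps

    print(generate_plan("Show me the temperature trend of tank 1"))

The narrower and more checkable the output space, the easier it is to keep the system's behaviour robust and repeatable across deployments.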

Like all emerging technologies, LLMs have attracted plenty of hype, but that hype is rapidly settling down. If experts share their thoughts openly, LLMs will find their justified place on the technology horizon soon enough. Despite all the challenges, I believe that LLMs are going to play an important role in the development of humanity.

Disclaimer: The above article is the sole opinion of the writer, and the employer of the writer does not bear any responsibility for the opinions mentioned in the article.


