The Power of Full Project Context #LLM
Alan Turing & The Turing Machine

I've tried integrating RAG into the DevoxxGenie plugin, but why limit myself to the fragments a similarity search returns when I can go all out?

RAG is so June 2024

Here's a mind-blowing secret: most of the latest features in the Devoxx Genie plugin were essentially 'developed' by the latest Claude 3.5 Sonnet large language model using the entire project code base as prompt context.

It's like having an expert senior developer guiding the development process, suggesting 100% correct implementations for the following Devoxx Genie features:

  • Allow a streaming response to be stopped
  • Keep the selected LLM provider after closing the settings page
  • Auto complete commands
  • Add files based on filtered text
  • Show file icons in list
  • Show plugin version number in settings page with GitHub link
  • Support for higher timeout values
  • Show progress bar and token usage bar

I promptly cancelled my OpenAI subscription and gave my credit card details to Anthropic...

Full Project Context

A Quantum Leap Beyond GitHub Copilot

Imagine having your entire project at your AI assistant's fingertips. That's now a reality with the latest version of the Devoxx Genie IDEA plugin combined with cloud-based models like Claude 3.5 Sonnet.

BTW, how long will it take until we can do this with local models?!

Add full project to prompt

The latest version of the plugin lets you add the full project to your prompt: your entire codebase becomes part of the AI's context. This feature offers a depth of understanding that traditional code completion tools can only dream of.

Smart Model Selection and Cost Estimation

The language model dropdown is not just a list anymore, it's your 'compass' for smart model selection.

  • See available context window sizes for each cloud model
  • View associated costs upfront
  • Make data-driven decisions on which model to use for your project

The new language model dropdown
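The kind of data-driven comparison the dropdown enables can be sketched as follows. The model names, window sizes, and per-million-token prices below are illustrative assumptions, not the plugin's actual tables:

```python
# Hypothetical model table: context window (tokens) and USD per 1M input tokens.
# All figures are illustrative assumptions, not official pricing.
MODELS = {
    "claude-3-5-sonnet": (200_000, 3.00),
    "gemini-1-5-flash": (1_000_000, 0.075),
    "gpt-4o": (128_000, 2.50),
}

def candidates(prompt_tokens: int) -> list[tuple[str, float]]:
    """Return models whose context window fits the prompt,
    cheapest estimated input cost first."""
    fits = [
        (name, prompt_tokens / 1_000_000 * price_per_m)
        for name, (window, price_per_m) in MODELS.items()
        if prompt_tokens <= window
    ]
    return sorted(fits, key=lambda pair: pair[1])

# A ~70K-token project fits all three windows; the flash-class model
# comes out cheapest for input tokens in this hypothetical table.
print(candidates(70_000))
```

A 500K-token prompt, by contrast, would leave only the 1M-token model in the running, which is exactly the kind of decision the dropdown surfaces at a glance.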

Visualizing Your Context Usage

Leverage the prompt cost calculator for precise budget management:

  • Track token usage with a progress bar

  • Get real-time updates on how much of the context window you're using

Calculate token cost with Claude Sonnet 3.5
Calculate cost with Google Gemini 1.5 Flash
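A minimal sketch of how such a usage bar could be estimated, assuming the common rough heuristic of ~4 characters per token (the plugin presumably relies on the providers' own tokenizers for exact counts):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token (a common heuristic;
    real tokenizers from Anthropic or Google will differ)."""
    return max(1, len(text) // 4)

def usage_bar(text: str, context_window: int, width: int = 20) -> str:
    """Render a textual progress bar of context-window usage."""
    used = estimate_tokens(text)
    fraction = min(used / context_window, 1.0)
    filled = int(fraction * width)
    return f"[{'#' * filled}{'-' * (width - filled)}] {used}/{context_window} tokens"

# A prompt that half-fills a 200K-token window:
print(usage_bar("x" * 400_000, 200_000))
```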

Cloud Models Overview

Via the plugin settings pages you can see the "Token Cost & Context Window" for all the available cloud models. A near-future release will let you update this table. I should probably also support the local models' context windows... #PullRequestsAreWelcome

Token Cost & Context Window

Handling Massive Projects?

"But wait, my project is HUGE!" you might say. Fear not. We've got options:

1. Leverage Gemini's Massive Context:

Gemini's colossal 1 million token window isn't just big, it's massive: enough capacity to ingest approximately 30,000 lines of code in a single prompt. That's enough to digest many codebases, from the tiniest scripts to some decently big projects.
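That 30,000-line figure is a back-of-the-envelope estimate. Assuming roughly 33 tokens per line of code (an illustrative average, and on the generous side; many codebases average fewer tokens per line and would fit even more), the arithmetic works out as:

```python
def lines_that_fit(context_window: int, tokens_per_line: int = 33) -> int:
    """Back-of-the-envelope capacity estimate for a context window,
    assuming ~33 tokens per line of code (illustrative; real averages vary)."""
    return context_window // tokens_per_line

print(lines_that_fit(1_000_000))  # roughly 30,000 lines in a 1M-token window
print(lines_that_fit(200_000))    # a 200K window holds a few thousand lines
```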

But if that's not enough you have more options...

BTW Google will be releasing 2M and even 10M token windows in the near future

2. Smart Filtering:

The new "Copy Project" plugin settings panel lets you:

  • Exclude specific directories
  • Filter by file extensions
  • Remove JavaDocs to slim down your context
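In standalone form, that kind of filtering might look like the sketch below. This is an approximation, not the plugin's actual implementation; the directory and extension sets are illustrative:

```python
import re
from pathlib import Path

EXCLUDED_DIRS = {".git", "build", "target", "node_modules"}  # illustrative
INCLUDED_EXTS = {".java", ".kt", ".xml"}                     # illustrative

# Javadoc comments start with /** and end with */.
JAVADOC = re.compile(r"/\*\*.*?\*/", re.DOTALL)

def collect_sources(root: Path) -> str:
    """Concatenate project sources, skipping excluded directories,
    keeping only selected extensions, and stripping Javadoc comments."""
    parts = []
    for path in sorted(root.rglob("*")):
        if any(part in EXCLUDED_DIRS for part in path.relative_to(root).parts):
            continue
        if path.is_file() and path.suffix in INCLUDED_EXTS:
            text = JAVADOC.sub("", path.read_text(encoding="utf-8"))
            parts.append(f"// File: {path.relative_to(root)}\n{text}")
    return "\n".join(parts)
```

Stripping Javadoc alone can shave a noticeable fraction off a well-documented Java codebase's token count, since comments are pure context-window overhead for many prompts.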

3. Selective Inclusion

Right-click to add only the most relevant parts of your project to the context and/or clipboard.

You can also copy your project to the clipboard, allowing you to paste your project code into an external chat window. This is a useful technique for sharing and collaborating on code.

Add Project Folders & Files using right-click

The Power of Full Context: A Real-World Example

The DevoxxGenie project itself, at about 70K tokens, fits comfortably within most high-end LLM context windows. This allows for incredibly nuanced interactions – we're talking advanced queries and feature requests that leave tools like GitHub Copilot scratching their virtual heads!

Conclusion: Stepping into the Future of Development

With Claude 3.5 Sonnet, Devoxx Genie isn't just another developer tool... it's a glimpse into the future of software engineering. As we eagerly await Claude 3.5 Opus, one thing is clear: we're witnessing a paradigm shift in AI-augmented programming.

Alan Turing, were he here today, might just say we've taken a significant leap towards AGI (for developers with Claude Sonnet 3.5).

Welcome to the cutting edge of AI-assisted development - welcome to DevoxxGenie!

X Twitter - GitHub - IntelliJ MarketPlace

DevoxxGenie is fully open source


So the entire project’s codebase is given to Claude, which runs on Anthropic’s servers?

Barry van Someren

Java Application Hosting & Support | PostgreSQL & Kubernetes Administration | Empowering Web Development Agencies

7 months ago

Man, I love living in the future. Excellent work, Stephan!

Lize Raes

Software Engineer and Product Manager

7 months ago

Wow, this is impressive and next-level! Amazing how fast DevoxxGenie has developed, and one can see you have been intensively using it yourself, allowing you to bring exactly those features that developers need! Thanks a lot for this excellent work!

Tom Cools

DevRel at Timefold, Java Champion, BeJUG organizer, (Keynote) Conference Speaker

8 months ago

What you have been doing with LLMs and DevoxxGenie is truly amazing! Congrats Stephan, looking forward to upgrading my plugin and trying this out!
