I want to create an AI digital IC engineer (1/n)
Link to the AIvril paper on arXiv: https://arxiv.org/pdf/2409.11411
At this moment I am using these AI tools for various purposes:
(My own subjective aversion to OpenAI prevents me from evaluating it as well, which is a character flaw that may or may not impact my progress.)
Anyway, I'm not going to describe the creation of an AI-based digital IC engineer; this is more an incomplete write-up of what we need to consider and think about when creating such a system.
AI HDL generation today
In this field, yesterday is totally different from today, so to speak. One to two months ago I tried to generate everything from simple HDL code to more advanced HDL code, like an APB Bus Functional Model, to see what it would produce. It is an interesting thought process that reveals things both about you and about the AI. Let's just follow a random example.
Prompts, tokens, words, instructions, whatever you want to call it: this is the key to unlocking AI potential. Me, a human by birth, I think about something and then talk out loud about it. Ask a question or tell a story, whatever. But I often forget that the context I have in my head is not known outside my own mind. So I could mention "Yeah, he got seriously cooked" and nobody would know what I am talking about. I could be talking about someone in the room, or I might have been thinking about someone but forgot to mention who "he" is. Context matters. Hence the prompt "generate me a systemverilog BFM of an APB slave" contains a lot of information but also requires more information. The AI is usually implemented such that it takes the information it has and does something with it. It could ask thousands of questions to get every detail ironed out, too, but that would create a lot of friction for those impatient bio-bots called humans. So, the prompt generates SystemVerilog code that has:
Over time, the generated code got more complicated, and it wasn't always exactly the same prompt.
Prompt: "can you show me an APB v3 systemverilog slave BFM implementation?"
// Grok v2 beta
module apb_slave_bfm (
    // APB Interface signals
    input  logic        pclk,
    input  logic        presetn,
    input  logic [31:0] paddr,
    input  logic        psel,
    input  logic        penable,
    input  logic        pwrite,
    input  logic [31:0] pwdata,
    output logic [31:0] prdata,
    output logic        pready,
    output logic        pslverr
);
    // Memory to simulate the slave's response
    logic [31:0] memory [0:1023]; // Assuming a small memory for example
    logic [31:0] data;

    // State for the APB transaction
    typedef enum logic [1:0] {
        IDLE   = 2'b00,
        SETUP  = 2'b01,
        ACCESS = 2'b10
    } apb_state_t;

    apb_state_t state, next_state;

    // Sequential logic for state machine
    always_ff @(posedge pclk or negedge presetn) begin
        if (!presetn)
            state <= IDLE;
        else
            state <= next_state;
    end

    // Combinational logic for state transitions and outputs
    always_comb begin
        next_state = state;
        pready  = 1'b0; // Default values
        pslverr = 1'b0;
        unique case (state)
            IDLE: begin
                if (psel && !penable)
                    next_state = SETUP;
            end
            SETUP: begin
                if (psel && penable)
                    next_state = ACCESS;
                else
                    next_state = IDLE; // If psel goes low, go back to IDLE
            end
            ACCESS: begin
                pready = 1'b1; // Signal that the slave is ready this cycle
                if (!psel)
                    next_state = IDLE;
                else if (pwrite) begin
                    // Write operation
                    memory[paddr[11:2]] = pwdata; // Assuming word addressable, adjust if byte addressable
                end else begin
                    // Read operation
                    data = memory[paddr[11:2]];
                end
                // Here you could add logic for pslverr if the address is invalid or operation not supported
            end
        endcase
        // Read data output
        if (state == ACCESS && !pwrite) begin
            prdata = data; // Output data on read
        end else begin
            prdata = 'z; // High impedance when not reading
        end
    end

    // Initialization of memory (optional for simulation)
    initial begin
        for (int i = 0; i < 1024; i++) memory[i] = i; // Just for example, fill with address value
    end
endmodule
So here you see more code; correct or not, I don't care so much at first. I like to see what choices were made, and then I add this to the prompt:
Hence, it is clear there are quite a few challenges in writing prompts for HDL code generation:
There are probably many more things to bring up. But the other major point today is that the code the AI is trained on is a bunch of GitHub repos littered with:
Hence the labelling of Verilog versus SystemVerilog code is never done; the syntax is all SystemVerilog, but some code is behavioral, some is RTL, some is technology dependent, tool dependent, or written in a bad coding style, ... so the AI has no clean and extensively labelled dataset to be trained on. And if we put all the good code together, it is going to be negligible in size versus Python, for example. Hence I naturally tend to favor Python as a verification environment over HDL languages. But there the issues are a steep learning curve, several different implementations, varying Python version compatibility, abandoned extensions/plugins/libraries, ...
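To make the Python angle a bit more concrete, here is a minimal sketch of what such a verification environment could look like, assuming the cocotb framework and the apb_slave_bfm module shown above. The signal names come from that module; the apb_write helper and the test values are purely illustrative, not a reference testbench.

# Minimal sketch of a Python (cocotb) test driving the generated APB slave BFM.
# Assumes cocotb is installed and apb_slave_bfm is the simulation toplevel.
import cocotb
from cocotb.clock import Clock
from cocotb.triggers import RisingEdge

async def apb_write(dut, addr, data):
    # Drive one APB write: SETUP phase, then ACCESS phase until pready is seen.
    dut.paddr.value   = addr
    dut.pwdata.value  = data
    dut.pwrite.value  = 1
    dut.psel.value    = 1
    dut.penable.value = 0
    await RisingEdge(dut.pclk)        # SETUP -> ACCESS
    dut.penable.value = 1
    await RisingEdge(dut.pclk)
    while dut.pready.value == 0:      # wait states, if any
        await RisingEdge(dut.pclk)
    dut.psel.value    = 0
    dut.penable.value = 0

@cocotb.test()
async def smoke_test(dut):
    # Free-running clock and an active-low reset pulse.
    cocotb.start_soon(Clock(dut.pclk, 10, units="ns").start())
    dut.psel.value    = 0
    dut.penable.value = 0
    dut.presetn.value = 0
    await RisingEdge(dut.pclk)
    dut.presetn.value = 1
    await RisingEdge(dut.pclk)
    # Illustrative transaction only; a real test would read back and check.
    await apb_write(dut, 0x0000_0004, 0xDEAD_BEEF)

The point is not this particular snippet; it is that the test logic lives in a language with a huge, well-labelled training corpus, while the HDL stays limited to the design under test.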
Conclusion
I want to continue this discussion with myself and others. Still, it will take a lot of time to come up with an AI smart enough to handle the design and verification of a low-complexity IP block. Maybe not; maybe there are smarties out there who could do it or have it already, who knows? At this moment, it has become usable in a "primitive" Q&A way. Maybe the road for AI is not design and verification; maybe it is in the direction of open source software. Take all the tools and repos, unify them in one language like Python, compatible with one Python 3 version, so that "pip install -r requirements.txt" gets you all the tools for the front-end (with a choice of multiple tools where multiple ones exist). And then the AI could self-verify the code for syntax errors and evaluate different RTL implementations based on requirements. Today, the open source repos all have different languages, different build tools, different dependencies, different levels of documentation, different sizes of user base, ... And I regularly see people inventing new HDL-ish languages to "solve" things, and they never pick up steam. The effort and community build-up fizzles out. Why?
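As a rough illustration of that self-verification idea, here is a minimal sketch of the kind of check an AI loop could run on its own output, assuming Verilator is installed and the generated RTL was saved to a file. The file name apb_slave_bfm.sv and the wrapper function are my own illustrative choices, not a recommendation of a specific flow.

# Sketch of an automated syntax/lint check on AI-generated RTL.
# Assumes the open-source Verilator tool is on PATH.
import subprocess

def lint_generated_rtl(sv_file: str) -> tuple[bool, str]:
    """Run Verilator in lint-only mode and return (passed, tool output)."""
    result = subprocess.run(
        ["verilator", "--lint-only", "-sv", sv_file],
        capture_output=True,
        text=True,
    )
    return result.returncode == 0, result.stdout + result.stderr

if __name__ == "__main__":
    ok, log = lint_generated_rtl("apb_slave_bfm.sv")  # hypothetical output file
    print("lint passed" if ok else "lint failed:\n" + log)

A generation loop could feed the lint output straight back into the next prompt, which is exactly the kind of closed-loop checking that is trivial to script once the tooling is unified.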
Digital ICs are about making algorithms run in parallel form on simple logic elements. If you get your algorithmic spec as C++ (or Python), you can make the first step by just making the code run in parallel form on regular processors. I'd tackle that with AI first, and then work on translating the pieces into gates. I.e. humans are (maybe) good at writing linear code and suck at parallel stuff, but gen-AI can parallelize code if it has a checking mechanism (tests/symbolic simulation). Translating the small parallel blocks of code into gates, or finding the right accelerator, is a tractable problem for AI.
Book author on SystemVerilog Assertions, Verilog/VHDL/design & verification processes
I strongly believe in the use of AI, like Perplexity, to check and express requirements, and to create SV/SVA RTL and TB code. Yes, as you addressed, AI is not quite there yet, but the structures and information it brings help better express what is needed in the design and verification. I used Perplexity in writing my paper (link below) to address the requirements and even to fix my English composition. The Traditional Req/Ack Handshake, It’s More Complicated Than You Think! https://systemverilog.us/vf/ReqAck90124.pdf
Founder @ LVS.ai
Unfortunately we have been 'programmed' to ascribe infinite 'subject matter expertise' to these various LLMs, so much so that it has evaporated our sensibility. Digital design is a very narrow field, but it is massive in knowledge, experience, reasoning and logic design. The datasets you could 'scrape' from the Internet are only a fraction of a fraction of that. When you look under the hood of AI in drug discovery, you see the same cracks as you see here.