Prompt Framework


By defining a prompt framework, we give structure to our LLM inputs, which lets us decide what information to provide, and where.


Prompt

All new text submitted to the LLM, composed of parts that together form the whole prompt. We name these parts Query, Corpus, and Context. They divide the prompt into mutually exclusive, collectively exhaustive parts, each of which significantly influences the LLM output.


Query

Patterns meant for the LLM to interpret, including instructions for the role the LLM should play, or the question you want the LLM to answer.


Corpus

Any supporting text referenced by the query.


Context

Any past prompts and responses that will be submitted along with the currently generated query and corpus.
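The three parts above can be sketched in code. This is a minimal illustration of the framework, not a standard API: the function name, the delimiters, and the labels inside the assembled string are hypothetical choices.

```python
# Illustrative sketch: assembling a full prompt from the three parts
# defined above (Query, Corpus, Context). The delimiters and labels
# are hypothetical, not part of any particular LLM's API.

def build_prompt(query: str, corpus: str, context: list[str]) -> str:
    """Combine context (past turns), corpus (reference text), and query."""
    parts = []
    if context:
        # Context: past prompts and responses, submitted along with the new text.
        parts.append("Previous conversation:\n" + "\n".join(context))
    if corpus:
        # Corpus: supporting text that the query refers to.
        parts.append("Reference text:\n" + corpus)
    # Query: the instruction or question the LLM should interpret.
    parts.append(query)
    return "\n\n".join(parts)

prompt = build_prompt(
    query="Summarize the reference text in one sentence.",
    corpus="LLMs generate text one token at a time...",
    context=["User: Hi", "Assistant: Hello! How can I help?"],
)
print(prompt)
```

Keeping the parts separate like this makes it easy to swap the corpus or trim old context without touching the query.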


Additional Terms

Pattern

Sentence(s) reliably understood by the LLM. Patterns can be categorized into Pattern Categories.


Context Window

The total number of tokens making up the current prompt, all past prompts and responses included in the next submission to the model, and the response to the current prompt. Each LLM can only work with a limited number of tokens at a time. E.g., if the LLM's context window is limited to 16,000 tokens, and all past prompts and responses plus the current prompt sum to 15,000 tokens, only 1,000 tokens remain for the response. The response can be continued, but with an unknown number of tokens.
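The budget arithmetic in that example can be written out directly. The window size matches the text's example; the split between past turns and the current prompt is a hypothetical breakdown of the 15,000 tokens.

```python
# Sketch of the context-window budget from the example above.
# A 16,000-token window with 15,000 tokens already consumed leaves
# only 1,000 tokens for the response.

CONTEXT_WINDOW = 16_000   # model's maximum tokens per request (example value)

past_turns = 14_000       # tokens in past prompts and responses (hypothetical split)
current_prompt = 1_000    # tokens in the newly submitted prompt (hypothetical split)

tokens_for_response = CONTEXT_WINDOW - (past_turns + current_prompt)
print(tokens_for_response)  # 1000 tokens left for the response
```

In practice this is why long conversations eventually require trimming or summarizing older turns: every token spent on context is a token unavailable to the response.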

Rocío Valero Lucas - Marketing translator

English/French/German/Swedish > Spanish/Spain

1y

Hi, Angelo. Thanks a lot for this! I'm saving it.

Martín Chamorro

AI Nerd - Facilitator - Translator & Reviewer

1y

Hey Angelo! Thank you! This is very generous! I had been waiting for this and I actually think I missed part 2! I'll check it out and get back to you with comments. Best!

Jimena Glasman

English <> Spanish translator | Certified translator CTPPC M.P. 1179 | Post-editor | Content localizer | Subtitler | Proofreader | Judicial expert assistant M.P. 529

1y

Hi Angelo! It makes perfect sense. I've never heard of "Context Window" before, that info is super useful! I don't know if you have already mentioned this (I haven't read sections 1 and 2), but I think that specifying the role you want AI to take when providing the output is important. Not only the mode and temperature, but also the type of author you want the model to be. It certainly helps me to get better output!
