Nexa Concepts and Context (Beta)
Tokens and LLMs
Tokens are the fundamental units of text that large language models (LLMs) process. Nexa interprets text by breaking it into tokens, analyzing them, and predicting subsequent tokens to generate responses. Understanding tokens and their usage is crucial for effective use of Nexa. Tokenization influences how queries are processed, the accuracy of results, and the overall efficiency and cost of operations. Conviva continues to optimize token usage to ensure responses remain accurate, efficient, and cost-effective.
For example, the sentence “Nexa makes data simple” may be split into tokens such as “Nexa”, “makes”, “data”, and “simple”.
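The split above can be sketched in a few lines of Python. Note this is a deliberate simplification: a whitespace split happens to reproduce the example, but production LLM tokenizers use sub-word schemes (such as byte-pair encoding), so real token counts usually differ from word counts.

```python
# Illustrative only: real LLM tokenizers split text into sub-word units,
# so a whitespace split is a rough stand-in, not the actual algorithm.
def rough_tokens(text: str) -> list:
    return text.split()

tokens = rough_tokens("Nexa makes data simple")
# tokens == ["Nexa", "makes", "data", "simple"]
```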
LLMs
LLM stands for Large Language Model. Nexa generates code using Gemini 2.5 Flash, which provides robust programming performance.
Note: Customer data from Nexa is never used to train Gemini AI.
At a high level, LLMs function as advanced autocomplete systems. They do not possess knowledge in the human sense; instead, they generate outputs by identifying and applying patterns learned during training. In Nexa, the LLM generates insights that align with a given prompt and the dashboard.
Token limits
LLMs can only handle a certain number of tokens at a time, which includes both:
The input you give (for example, a long question or document)
The output it generates (for example, the response, or the code)
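Because input and output share the same budget, a long prompt leaves fewer tokens for the response. A minimal sketch of that accounting, using hypothetical token counts and limit values for illustration:

```python
def fits_in_limit(input_tokens: int, output_tokens: int, limit: int) -> bool:
    """Both the prompt and the generated response count toward the limit."""
    return input_tokens + output_tokens <= limit

# Hypothetical numbers: a 4096-token limit shared by input and output.
fits_in_limit(3000, 1000, 4096)  # True: 4000 tokens fit
fits_in_limit(3500, 1000, 4096)  # False: 4500 tokens exceed the limit
```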
System prompts and context
Prompts
A prompt is a message you send to the AI. How you write your prompts is key to getting good results.
Context
Context refers to all domain information available to the AI, including prompts, prior responses, and existing code. The context window defines the amount of text an LLM can process simultaneously. Nexa provides a large context window; however, the AI cannot be relied upon to retain the entire conversation. Larger context windows also consume tokens more quickly, which increases cost. To preserve context while maintaining a smaller window, the AI can summarize the conversation to date and then reset the context window.
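The summarize-and-reset strategy described above can be sketched as follows. This is a minimal illustration, not Nexa's implementation: the token estimate is a crude word count, and `summarize` is a hypothetical stand-in for a call that asks the LLM to condense earlier messages.

```python
def estimate_tokens(text: str) -> int:
    # Crude estimate: word count. Real tokenizers count sub-word units.
    return len(text.split())

def summarize(messages: list) -> str:
    # Hypothetical stand-in: a real system would ask the LLM for a summary.
    return "[summary of %d earlier messages]" % len(messages)

def compact_history(messages: list, window: int) -> list:
    """When the conversation exceeds the window, replace older messages
    with a one-line summary and keep only the most recent message."""
    total = sum(estimate_tokens(m) for m in messages)
    if total <= window:
        return messages
    return [summarize(messages[:-1]), messages[-1]]
```

With a large window the history is returned unchanged; once the estimate exceeds the window, everything but the latest message collapses into the summary, freeing tokens for new turns.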