Best Practices
Best Practices for Interacting with Finnex AI
Follow these guidelines to ensure smooth, efficient, and cost-effective use of the Finnex AI Assistant.
🧩 When Interacting with Excel
Always specify the range (e.g., Sheet1!B2:D10) when asking AI to read or write data. If no range is provided, AI will act on the active cell by default.
Use “My Excel” in your query for better recognition.
📎 When Interacting with Attachments
Use the Upload button to attach documents (PDF, PNG, JPG, TXT, etc.).
Multiple files can be uploaded, but the combined size must stay within 50,000 tokens (~40k words).
Refer to files as “Attached file” when asking AI to extract data or answer questions.
🗃️ When Interacting with the Database
Ensure uploaded data is clean and well-structured.
Always include a Unique Identifier (Primary Key) when updating records.
Use “My database” or “My DB” in your prompts for accurate database handling.
Best Practices for Efficiency & Cost Control
These practices help reduce token usage and optimise performance:
🔄 Start Fresh in a New Sheet
AI outputs overwrite Excel content. Use a new sheet to avoid accidental data loss.
⚠️ Avoid Sending Large Datasets to AI
GPT models are not optimised for processing large datasets directly.
Best Practice:
Upload large datasets to the Database tab, then ask AI to analyse the data from there.
🧹 Clean Up Chat Threads Regularly
Older threads add context, increasing token usage and slowing responses.
Best Practice:
Start a new thread for new topics. Use AI to summarise previous threads if needed before continuing.
🧠 Be Specific & Concise
AI performs best with clear and to-the-point instructions.
Best Practice:
Avoid vague or overly long prompts. Treat the AI like a skilled assistant who doesn’t know your business yet.
🧰 Use Built-In Features Instead of Repetitive Prompts
Use the Excel-Database integration, automation library, or predefined templates where possible.
Best Practice:
Let built-in tools handle repetitive steps to reduce token consumption.
🧼 Preprocess and Filter Data First
Pre-filter your data in Excel before passing it to AI for analysis.
Best Practice:
Limit the data scope to only what's relevant. This minimises token usage and improves performance.
📊 Monitor and Manage Your Credits
Credits correspond to tokens. Watching your usage helps you control costs.
Best Practice:
Use the dashboard to monitor token consumption and plan accordingly.
📅 Plan Your Usage in Advance
Anticipate your AI needs and buy credits accordingly.
Best Practice:
Buy credits in bulk when discounted or ahead of high-usage periods. Choose the right plan based on your usage patterns.
🤖 Use Automation (Enterprise Users Only)
The Library interface allows scripting of common tasks to save time and tokens.
Best Practice:
Automate routine tasks to reduce repeated AI queries.
🔧 Review and Refine Scripts Regularly (Enterprise Users Only)
Outdated scripts may be inefficient or error-prone.
Best Practice:
Regularly review and update your automation scripts for improved accuracy and token efficiency.