When a customer hands you a spreadsheet of competitor parts and asks “what do you have that matches?”, the work has traditionally meant hours of line-by-line lookups across datasheets, crossover tables, and inventory. A Circuit agent configured for the Request for Proposal domain can do most of that work for you: it reads the file, matches each line against your catalogs and crossover tables, flags lines it isn’t confident about, and writes the results back to a populated spreadsheet.
Why it matters
A crossover job usually combines three slow things: reading a competitor file in whatever format the customer sent, looking up each line in your own material, and producing a clean response. Circuit collapses all three into a single chat:

- Your catalogs already live in Circuit. Once datasheets, crossover tables, and inventory are in an index, the agent can query them with the same `Search`, `ReadPDF`, and `CSVQuery` tools it uses for every other task.
- Every line is tracked. The agent records each line as a work item, so you can reopen the chat tomorrow and see exactly which lines matched, which failed, and which the agent wants a human to confirm.
- Uncertainty is visible. When the agent is not confident about a match, it flags the line on the output and on its work item instead of guessing. A reviewer can filter the Work items panel to see only those lines.
- The output is a file, not a wall of chat. The agent writes an Excel or CSV with competitor part, matched part, description, confidence, and notes, ready to hand back to the customer.
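The per-line tracking described above can be pictured as a small record per competitor line plus a filter for the reviewer. This is only an illustrative sketch; the field names and statuses here are assumptions, not Circuit's actual work item schema:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LineItem:
    """Illustrative stand-in for a per-line work item.

    Field names and status values are hypothetical, not
    Circuit's actual entity schema.
    """
    competitor_pn: str
    our_pn: Optional[str] = None
    status: str = "pending"  # pending | matched | flagged | failed
    note: str = ""

def needs_review(items: List[LineItem]) -> List[LineItem]:
    """The reviewer's filter: only lines a human should confirm."""
    return [i for i in items if i.status in ("flagged", "failed")]
```

Because each line carries its own status, "which lines matched, which failed, and which need a human" is a filter, not a re-read of the whole chat.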
Before you start
You need three things in place:

- An index that holds your crossover material. This typically includes product catalogs, datasheets, any existing crossover tables (Excel files mapping competitor PN to your PN), and current inventory or pricing data. See indexes and managing documents.
- A `SCHEMA.md` file next to any structured spreadsheets you want the agent to query with `CSVQuery`. The agent reads `SCHEMA.md` to understand column types. Without it, numeric columns may be queried as text and matches can silently fail.
- An agent with the Request for Proposal domain. See agents overview for how to create one; the crossover-specific setup is below.
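The exact `SCHEMA.md` conventions are whatever your Circuit deployment expects; as a purely hypothetical illustration, a schema for a crossover spreadsheet might declare each column with a type so `CSVQuery` treats prices and quantities as numbers rather than text:

```markdown
# Schema for globex_crossover.xlsx (hypothetical layout)

| Column        | Type    | Notes                     |
| ------------- | ------- | ------------------------- |
| competitor_pn | text    | Globex part number        |
| our_pn        | text    | Our matching part number  |
| unit_price    | number  | USD list price            |
| min_order_qty | integer | Blank means no minimum    |
```

The file name `globex_crossover.xlsx` and the column names are invented for this example; the point is that every numeric column is declared as numeric.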
Setting up a crossover agent
Create the agent
From the agents section, create a new agent and choose the Request for Proposal domain. This domain ships with the `ReadOffice`, `WriteOffice`, `CSVQuery`, `Search`, and `ReadPDF` tools, plus the work item entities the agent uses to track line items.

Set the reference scope
Add the index or indexes that contain your catalogs, crossover tables, and inventory. The agent can only match against material you give it access to.
Run the bootstrap interview
Open a chat with the new agent and say:
Let’s set this agent up for crossover work.

The agent runs the `bootstrap-rfp` skill and interviews you: which competitors to expect, where your crossover tables live, what column layouts they use, which suppliers you represent, and how you want low-confidence matches flagged. Answer conversationally. The agent explores your reference index along the way to pre-fill what it can.

Review the generated AGENT.md
The bootstrap skill writes a customized `AGENT.md` into the agent’s filesystem at `/Circuit/Skills/AGENT.md`. Open it and confirm the reference map, competitor list, and flagging rules match how your team actually works. You can edit it anytime without going back through the interview.

Run a test crossover
Attach a small sample file (ten to twenty lines is enough) and ask the agent to run a crossover. Check the output for matches you know are correct, and look at how the agent handles ambiguous lines. Adjust `AGENT.md` if the tone, confidence threshold, or output format needs tuning.

Running a crossover
Once the agent is set up, a typical run looks like this.

Start a chat and attach the competitor file
In the chat prompt, click the attach button and upload the competitor spreadsheet or PDF. The agent reads `.xlsx`, `.xls`, `.csv`, and `.pdf` files directly. You don’t need to reformat the file first.

Describe the job
Tell the agent what the file is and what you want back. For example:
This is a competitor parts list from Acme Pumps. Please cross it against our catalog and write a response spreadsheet with our matched part, description, unit price, and a confidence column. Flag anything you’re not sure about for my review.

Give the competitor name if you know it. The agent routes the lookup to the right crossover table when there is one.
Let the agent work
The agent creates a `WorkItemGroup` for the job and a `WorkItem` for each line. It reads the file, looks up each line against your indexes and crossover tables, and records the result on the work item. Long jobs report progress periodically. You can leave the chat and come back later; work items persist across chats.

Review the output
The agent writes a populated spreadsheet to its filesystem (by default under `/User/`, or in a shared output folder you configured in `AGENT.md`) and links you to it in the chat. Open the file, scan the confidence column, and look at any rows the agent flagged.

What the agent handles well
- Mixed file formats. Excel, CSV, and PDF price lists work without reformatting. `ReadOffice` handles spreadsheet structure; `ReadPDF` handles price lists that only exist as PDFs.
- Direct crossover tables. When you have an Excel file mapping competitor PN to your PN, the agent prefers it via `CSVQuery`. This is the fastest and most reliable path, and it benefits most from a clean `SCHEMA.md`.
- Finding equivalents in datasheets. When no direct crossover table exists, the agent searches your catalogs and datasheets for equivalents using `Search` and `ReadPDF`.
- Bundles and many-to-many matches. You can ask the agent to suggest a combination of your parts that together meet a competitor bundle SKU. Capture this expectation in `AGENT.md` or the prompt so the agent treats bundle logic as in-scope.
- Flagging over guessing. When the agent cannot find a confident match, it records the line as `failed` or adds a review note on the work item rather than inventing a match.
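The match strategy above boils down to a two-stage lookup: an exact hit in the crossover table first, then a fallback search, with anything uncertain flagged rather than guessed. A minimal sketch of that logic, where the CSV layout, column names, and confidence labels are illustrative assumptions rather than Circuit's actual implementation (and the fallback here is a simple fuzzy string match standing in for the agent's datasheet search):

```python
import csv
import difflib

def cross_reference(competitor_lines, crossover_csv_path):
    """Two-stage crossover match.

    Stage 1: exact hit in the crossover table -> high confidence.
    Stage 2: closest fuzzy candidate -> medium confidence, flagged.
    No plausible candidate -> failed, left for human review.
    Column names ("competitor_pn", "our_pn") are assumptions.
    """
    with open(crossover_csv_path, newline="") as f:
        table = {row["competitor_pn"]: row["our_pn"]
                 for row in csv.DictReader(f)}

    results = []
    for pn in competitor_lines:
        if pn in table:  # direct crossover-table hit
            results.append({"competitor_pn": pn, "our_pn": table[pn],
                            "confidence": "high", "flagged": False})
            continue
        close = difflib.get_close_matches(pn, table.keys(), n=1, cutoff=0.8)
        if close:  # near miss: propose it, but flag for review
            results.append({"competitor_pn": pn, "our_pn": table[close[0]],
                            "confidence": "medium", "flagged": True})
        else:  # no confident match: record failure, never invent one
            results.append({"competitor_pn": pn, "our_pn": None,
                            "confidence": "failed", "flagged": True})
    return results
```

The design point is the last branch: an unmatched line produces an explicit `failed` record, which is what makes the reviewer's filter possible.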
Tips
- Set a confidence threshold in `AGENT.md`. Decide with your team what “flag for review” means (for example, no direct crossover-table hit, or a datasheet-only match). Capture it once and the agent applies it every time.
- Keep one `WorkItemGroup` per customer opportunity. If the same customer sends a revised list, ask the agent to reuse the existing group and update its items rather than creating a new one.
- Ask for explanations on close calls. “Why did you pick XY-500 for their A-500 line instead of XY-550?” is a useful question. The agent cites the catalog or datasheet page it used.
- Export in the format the customer wants. “Write the output as a CSV with columns Competitor PN, Our PN, Description, Unit Price, Confidence, Notes” produces a ready-to-send file instead of a generic dump.
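The ready-to-send layout in that last tip is just a CSV with a fixed column order. A minimal sketch of such an export, with the column names taken from the tip and everything else (function name, row shape) illustrative:

```python
import csv

# Column order from the customer-facing request; the names
# come from the prompt above, the rest is an assumption.
OUTPUT_COLUMNS = ["Competitor PN", "Our PN", "Description",
                  "Unit Price", "Confidence", "Notes"]

def write_response_csv(rows, path):
    """Write matched rows in the customer-facing column order.

    Each row is a dict keyed by the column names above.
    """
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=OUTPUT_COLUMNS)
        writer.writeheader()
        writer.writerows(rows)
```

Fixing the column order up front means the file can go straight back to the customer without a manual reshuffle in Excel.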
Example chat
You: Attached is a competitor parts list from Globex, 48 lines. Please cross it against our catalog and produce an Excel response with our matched part, description, list price, and a confidence column. Flag low-confidence matches for my review.
Agent: Creating work item group “Globex crossover - 2026-04-24” with 48 line items. I’ll use the Globex crossover table for direct hits and fall back to datasheet search for anything not found there. Starting now.
Agent: Processed 48 of 48 lines. 39 matched directly from the Globex crossover table (high confidence), 6 matched via datasheet search (medium confidence, flagged), 3 had no match and are marked failed for your review. Output written to /User/globex_crossover_2026-04-24.xlsx.
You: For line 22 you flagged medium confidence. Why?
Agent: Line 22 is Globex G-PX4400. The crossover table has no entry. The closest datasheet match is our PX-4400-HD based on pressure rating and connection size, but the Globex spec lists a higher flow rate than our PX-4400-HD. Consider whether the customer needs the flow rate the competitor offers.
You: Good catch. Swap it for PX-4500-HD and close that item.
Agent: Updated line 22 to PX-4500-HD and marked the work item completed. The output spreadsheet now reflects the change.