Download Results

Every answer THiNK generates can be saved. At the bottom of each response you will find a small action bar with options to copy, give feedback, view references, and download all output files produced during that analysis.

The Response Action Bar

Each response ends with a row of actions:

  • Copy — copies the response text to your clipboard
  • Thumbs up / Thumbs down — submit feedback on the response quality
  • View References — opens the full list of citations and sources used in the analysis, with a count badge showing how many are available
  • Download Results — opens the Output Files panel for that response

The Output Files Panel

Selecting Browse and Download Results from the Download Results menu opens a panel listing every file THiNK generated for that response. For a typical analysis, such as a family trio variant review, this might include:

  • Answer files — the main written response as a Markdown file, with a separate verified version (e.g. ANSWER_Family_Trio_Variant_1_verified.md)
  • internet-research/ — sources and references gathered during the analysis
  • python/ — the scripts used to run statistical tests and generate plots
  • qc/ — quality control outputs
  • ancestry/ — ancestry analysis results, if applicable
  • queries/ — the database queries run against variant annotation sources

Click Download All at the bottom of the panel to save everything as a zip, or expand individual folders to download specific files.
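If you prefer to work with the downloaded archive programmatically rather than expanding folders in the panel, Python's standard-library zipfile module can inspect it. This is a minimal sketch; the archive name and entry paths are illustrative, not guaranteed by THiNK:

```python
import zipfile


def list_output_files(archive_path):
    """Group the entries of a downloaded results zip by top-level folder.

    Returns a dict mapping each top-level name (e.g. "python", "qc",
    or a standalone answer file) to the list of entries under it.
    """
    with zipfile.ZipFile(archive_path) as zf:
        names = zf.namelist()
    groups = {}
    for name in names:
        top = name.split("/", 1)[0]
        groups.setdefault(top, []).append(name)
    return groups


# Hypothetical filename -- substitute the zip you saved via Download All:
# groups = list_output_files("think_results.zip")
# print(sorted(groups))
```

Grouping by top-level folder mirrors the panel's layout, so the same names (python/, qc/, queries/, and so on) appear as dictionary keys.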

Verified vs. unverified answers

THiNK produces two versions of each answer: an initial response and a verified version that has been cross-checked against reference databases. The verified file is the one to use for reporting and sharing with collaborators.

Reproducibility

The Python scripts in the output include all parameters used during the session, so any analysis can be re-run exactly in your own environment with the same input data.
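Because the parameters are embedded in the scripts themselves, re-running one requires no extra arguments. As a minimal sketch (the extraction folder and script name here are hypothetical), a small helper can invoke a bundled script with your current interpreter:

```python
import subprocess
import sys


def rerun(script_path):
    """Re-run a downloaded analysis script with the current Python interpreter.

    The downloaded scripts embed their session parameters, so no
    command-line arguments should be needed. Returns the script's stdout.
    """
    result = subprocess.run(
        [sys.executable, str(script_path)],
        capture_output=True,
        text=True,
        check=True,  # raise CalledProcessError if the script exits non-zero
    )
    return result.stdout


# Hypothetical path -- substitute a script from your extracted download:
# output = rerun("think_results/python/variant_stats.py")
```

Running the scripts with the same input data used in the session should reproduce the original results.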

💡 Tips

  • Use the verified answer file when sharing results with collaborators or including findings in reports
  • The Python scripts and query files are useful for auditing exactly what the agent did to produce a result