
DenchClaw App Performance Tips

Performance tips for DenchClaw App Builder developers: optimize DuckDB queries, use caching with dench.store, lazy load data, and avoid common bottlenecks.

Mark Rachapoom
·6 min read

A Dench App that loads slowly is a Dench App nobody uses. The good news: DuckDB is extremely fast for analytical queries, and the window.dench bridge API has caching primitives that make subsequent loads near-instant. The bad news: it's easy to accidentally write slow code patterns that kill performance.

This guide covers the most common performance bottlenecks in Dench Apps and how to fix them.

1. Optimize Your DuckDB Queries#

DuckDB is columnar and fast, but badly written queries can still be slow on large datasets. The most common issues:

Problem: Selecting * when you only need a few columns#

// Slow: loads all columns even if you display 3
const data = await dench.db.query("SELECT * FROM v_people");
 
// Fast: only load what you need
const data = await dench.db.query(`
  SELECT id, "Full Name", "Email Address", "Status"
  FROM v_people
  WHERE "Status" = 'Lead'
`);

The difference matters because v_people is a PIVOT view that recomputes its fields on every query; selecting fewer columns reduces the work DuckDB has to do.

Problem: Fetching all rows and filtering in JavaScript#

// Slow: loads 5000 rows into the browser, filters in JS
const all = await dench.db.query("SELECT * FROM v_people");
const filtered = all.filter(p => p.Status === 'Lead');
 
// Fast: let DuckDB filter
const filtered = await dench.db.query("SELECT * FROM v_people WHERE \"Status\" = 'Lead'");

Always push filters, limits, and aggregations into the SQL query. DuckDB handles this orders of magnitude faster than JavaScript.

Problem: Running N queries in a loop#

// Very slow: one query per deal (N+1 problem)
const deals = await dench.db.query("SELECT * FROM v_deals");
for (const deal of deals) {
  const contact = await dench.db.query(`SELECT * FROM v_people WHERE id = '${deal.contact_id}'`);
}
 
// Fast: single JOIN query
const dealsWithContacts = await dench.db.query(`
  SELECT d.*, p."Full Name" AS contact_name, p."Email Address" AS contact_email
  FROM v_deals d
  LEFT JOIN v_people p ON p.id = d."Contact"
`);

Never query inside a loop. Always JOIN.
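When a JOIN isn't an option (for example, the related records come from a query you've already run), you can still avoid the N+1 pattern by batching ids into a single IN query. A sketch using the same dench.db.query bridge as above; escapeSqlString is a minimal helper written for this example, not part of the bridge API:

```javascript
// Minimal single-quote escaping for interpolated SQL string literals
function escapeSqlString(value) {
  return "'" + String(value).replace(/'/g, "''") + "'";
}

// One IN query for all ids instead of one query per id
async function fetchContactsByIds(ids) {
  if (ids.length === 0) return [];
  const inList = ids.map(escapeSqlString).join(', ');
  return dench.db.query(`SELECT * FROM v_people WHERE id IN (${inList})`);
}
```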

Use LIMIT for large datasets#

If your app only shows the top 20 results, add LIMIT 20 to your query. Don't load 5000 rows and show 20.
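As a sketch (the column names here are illustrative assumptions):

```javascript
// Fetch only as many rows as the view will actually render
async function loadTopDeals(limit = 20) {
  return dench.db.query(`
    SELECT id, "Name", "Value"
    FROM v_deals
    ORDER BY CAST("Value" AS DOUBLE) DESC
    LIMIT ${limit}
  `);
}
```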

2. Cache with dench.store#

Some data doesn't change often. CRM schemas, static reference data, and heavily computed aggregations are good candidates for caching.

const CACHE_TTL = 5 * 60 * 1000; // 5 minutes in milliseconds
 
async function getCachedData(key, queryFn) {
  const cached = await dench.store.get(key);
  const cachedAt = Number(await dench.store.get(key + '_timestamp')); // stored values may come back as strings
  
  if (cached && cachedAt && (Date.now() - cachedAt) < CACHE_TTL) {
    return JSON.parse(cached);
  }
  
  const fresh = await queryFn();
  await dench.store.set(key, JSON.stringify(fresh));
  await dench.store.set(key + '_timestamp', Date.now());
  return fresh;
}
 
// Usage
const contacts = await getCachedData('contacts_list', () =>
  dench.db.query("SELECT id, \"Full Name\", \"Company\" FROM v_people ORDER BY \"Full Name\"")
);

Store lookups are cheap and fast (typically sub-millisecond), while database queries can take 50-500ms for large datasets, so a cache hit turns an expensive query into a quick read.

What to cache: Contact/company lists for dropdowns, schema information, aggregated totals that don't change often.

What not to cache: Real-time pipeline data, anything that should reflect the latest CRM state immediately.
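When your app writes data that a cached query depends on, invalidate the cache right after the write. A minimal sketch using only the get/set calls shown above; zeroing the timestamp makes the next getCachedData call refetch:

```javascript
// Expire a cached entry by resetting its timestamp; the next
// getCachedData call will see a stale timestamp and re-run the query
async function invalidateCache(key) {
  await dench.store.set(key + '_timestamp', 0);
}

// e.g. after creating a contact (createContact is a hypothetical write helper):
// await createContact(formData);
// await invalidateCache('contacts_list');
```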

3. Load Data Lazily#

Don't fetch all data at startup. Fetch what's needed for the initial render, then load more on demand.

// Bad: loads everything upfront
async function init() {
  const allContacts = await dench.db.query("SELECT * FROM v_people"); // May be slow
  const allDeals = await dench.db.query("SELECT * FROM v_deals");
  const allCompanies = await dench.db.query("SELECT * FROM v_companies");
  renderAll();
}
 
// Good: load incrementally
async function init() {
  // Load only what's visible on initial render
  const recentDeals = await dench.db.query("SELECT * FROM v_deals ORDER BY updated_at DESC LIMIT 20");
  renderDeals(recentDeals);
  
  // Load secondary data asynchronously after initial render
  requestIdleCallback(async () => {
    const contactDropdownData = await dench.db.query("SELECT id, \"Full Name\" FROM v_people");
    populateContactDropdown(contactDropdownData);
  });
}

Progressive Loading Pattern#

For paginated data, implement infinite scroll or a "Load more" button:

let offset = 0;
const PAGE_SIZE = 25;
 
async function loadPage() {
  const data = await dench.db.query(`
    SELECT * FROM v_deals ORDER BY updated_at DESC LIMIT ${PAGE_SIZE} OFFSET ${offset}
  `);
  renderRows(data, offset === 0);
  offset += data.length;
  document.getElementById('loadMore').style.display = data.length < PAGE_SIZE ? 'none' : 'block';
}

4. Widget Mode Performance#

Widgets render in a compact grid and refresh on a timer. Performance requirements are stricter:

  • Keep queries minimal: A widget showing 4 KPIs should run 4 targeted SQL aggregations, not a full table scan
  • Use COUNT and SUM, not SELECT *: SELECT COUNT(*) FROM v_deals WHERE "Stage" = 'Qualified' is much faster than loading all deals and counting in JS
  • Set an appropriate refresh interval: refreshMs: 300000 (5 minutes) is usually fine. 60000 (1 minute) is the minimum for most widgets. Don't go lower unless you have a real-time requirement.
// Good widget query: aggregation at the DB level
const kpis = await Promise.all([
  dench.db.query("SELECT COUNT(*) AS c FROM v_deals WHERE \"Stage\" NOT IN ('Closed Won', 'Closed Lost')"),
  dench.db.query("SELECT SUM(CAST(\"Value\" AS DOUBLE)) AS total FROM v_deals WHERE \"Stage\" = 'Closed Won' AND DATE_TRUNC('month', CAST(\"Close Date\" AS DATE)) = DATE_TRUNC('month', CURRENT_DATE)")
]);

5. Avoid Layout Thrashing#

When rendering large tables or lists, don't read and write DOM properties alternately. Build your HTML string first, then set it once:

// Bad: causes layout thrashing
rows.forEach(row => {
  const tr = document.createElement('tr');
  tr.style.height = tr.offsetHeight + 10 + 'px'; // Forces reflow on each iteration
  tbody.appendChild(tr);
});
 
// Good: build string, set once
const html = rows.map(row => `<tr>...</tr>`).join('');
tbody.innerHTML = html;
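One caution with the innerHTML approach: if field values can contain <, &, or quotes, escape them before interpolating, both to keep the markup valid and to avoid injecting script into your app. A small helper, with sample rows standing in for query results:

```javascript
// Escape the characters that matter inside HTML text and attribute values
function escapeHtml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

// Sample rows standing in for query results
const sampleRows = [{ 'Full Name': 'Ada & Co <Test>' }];
const safeHtml = sampleRows
  .map(row => `<tr><td>${escapeHtml(row['Full Name'])}</td></tr>`)
  .join('');
```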

6. Use requestIdleCallback for Non-Critical Work#

Defer non-critical initialization (tooltip setup, animation setup, secondary data load) until the browser is idle:

// Initial render first
renderPrimaryContent(data);
 
// Non-critical setup deferred
requestIdleCallback(() => {
  setupTooltips();
  loadSecondaryMetrics();
  prefetchAdjacentPages();
});
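requestIdleCallback isn't available in every browser (older Safari in particular), so it's worth picking the scheduler once with a setTimeout fallback. The 200ms delay is an arbitrary assumption:

```javascript
// Use idle-time scheduling where supported, otherwise a short timeout
const deferWork = typeof requestIdleCallback === 'function'
  ? (cb) => requestIdleCallback(cb)
  : (cb) => setTimeout(cb, 200);

// Usage: deferWork(() => setupTooltips());
```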

Frequently Asked Questions#

How do I profile my DuckDB query performance?#

Add timing around your query: const t = Date.now(); const result = await dench.db.query(...); console.log('Query time:', Date.now() - t, 'ms'). For queries over 500ms, check for missing filters, SELECT * patterns, or N+1 problems.
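If you're timing more than one query, a small wrapper keeps the measurement in one place; timed is a helper written for this example, not part of the bridge API:

```javascript
// Log how long any async call takes, and pass its result through
async function timed(label, fn) {
  const start = Date.now();
  const result = await fn();
  console.log(`${label}: ${Date.now() - start} ms`);
  return result;
}

// const deals = await timed('deals query', () =>
//   dench.db.query('SELECT * FROM v_deals LIMIT 100'));
```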

Is there a query result size limit?#

The bridge API returns query results as JSON arrays in memory. For very large result sets (10k+ rows), the serialization overhead can be significant. Always use LIMIT and push aggregations to DuckDB.

How do I speed up a slow JOIN?#

DuckDB is very efficient at JOINs on indexed fields. Ensure you're joining on id fields (which are always available). Avoid joining on computed or text fields when possible.

Typical DuckDB performance: queries on 50k-row tables complete in under 100ms on modern hardware. Multi-million row tables may need more careful query design but are still fast for aggregation queries. The bottleneck is usually serialization (converting results to JSON for the bridge), not the query itself.

Ready to try DenchClaw? Install in one command: npx denchclaw. Full setup guide →

Written by Mark Rachapoom

Building the future of AI CRM software.
© 2026 DenchHQ · San Francisco, CA