You are running a BigQuery query that filters a large table using a WHERE clause on timestamp and ID columns. A dry run with bq query --dry_run shows that the entire table is scanned, even though the filter should match only a small subset of the data. You want to minimize the amount of data scanned while making minimal changes to your existing SQL queries. What should you do to improve query performance and reduce costs?
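For context, the scenario above can be reproduced with a dry run, and the usual remedy is to store the data in a table that is partitioned on the timestamp column and clustered on the ID column, so matching queries prune partitions without any SQL changes. A sketch, assuming hypothetical names (dataset `mydataset`, tables `events` and `events_opt`, columns `event_ts` and `user_id`):

```shell
# Dry run: reports bytes that WOULD be scanned without running the query.
bq query --use_legacy_sql=false --dry_run \
  'SELECT * FROM mydataset.events
   WHERE event_ts >= TIMESTAMP "2024-01-01" AND user_id = "u123"'

# Recreate the table partitioned by day on the timestamp column and
# clustered on the ID column; existing queries keep the same SQL.
bq query --use_legacy_sql=false \
  'CREATE TABLE mydataset.events_opt
   PARTITION BY DATE(event_ts)
   CLUSTER BY user_id
   AS SELECT * FROM mydataset.events'

# Re-run the dry run against the new table: only matching partitions
# are counted, so the estimated bytes processed drops sharply.
bq query --use_legacy_sql=false --dry_run \
  'SELECT * FROM mydataset.events_opt
   WHERE event_ts >= TIMESTAMP "2024-01-01" AND user_id = "u123"'
```

Because the filter columns and query text stay the same, this satisfies the "minimal changes to existing SQL" constraint; only the table reference (or the table itself, if recreated in place) changes.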