I am experimenting with BigLake Iceberg tables in BigQuery.
I have created a BigLake Iceberg table in BigQuery and added a few records to it using the BigQuery console. On the Spark side, I can read the records from the table via the BigLake Metastore catalog implementation and also through the BigQuery Storage Read API.
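For reference, my Spark setup looks roughly like the sketch below. The project, location, catalog, dataset, bucket, and table names are placeholders, and the catalog options follow the BigLake Metastore documentation (exact option names may differ by connector/jar version):

```python
from pyspark.sql import SparkSession

# Placeholder names throughout -- substitute your own project, location,
# catalog, dataset, bucket and table.
spark = (
    SparkSession.builder
    .appName("biglake-iceberg-read")
    # Iceberg catalog backed by BigLake Metastore
    .config("spark.sql.catalog.blms", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.blms.catalog-impl",
            "org.apache.iceberg.gcp.biglake.BigLakeCatalog")
    .config("spark.sql.catalog.blms.gcp_project", "my-project")
    .config("spark.sql.catalog.blms.gcp_location", "us")
    .config("spark.sql.catalog.blms.blms_catalog", "iceberg_catalog")
    .config("spark.sql.catalog.blms.warehouse", "gs://my-bucket/warehouse")
    .getOrCreate()
)

# Read through the BigLake Metastore catalog
spark.sql("SELECT * FROM blms.my_dataset.my_iceberg_table").show()

# Read the same table through the BigQuery Storage Read API
# (spark-bigquery connector)
df = (
    spark.read.format("bigquery")
    .option("table", "my-project.my_dataset.my_iceberg_table")
    .load()
)
df.show()
```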
But when I insert further records from the BigQuery console, the newly added records are not reflected in Spark, even if I reload the table; they only show up after some time. In contrast, inserts made from Spark are reflected immediately on the BigQuery side.
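The BigQuery-side insert is a plain `INSERT INTO ... VALUES ...` statement run in the console. On the Spark side, reloading looks roughly like this (again a sketch; catalog and table names are placeholders):

```python
# After inserting more rows from the BigQuery console, e.g.
#   INSERT INTO my_dataset.my_iceberg_table VALUES (...);

# Drop any cached metadata/data for the table ...
spark.catalog.refreshTable("blms.my_dataset.my_iceberg_table")
# ... or equivalently via SQL
spark.sql("REFRESH TABLE blms.my_dataset.my_iceberg_table")

# ... then re-read: the newly inserted rows still do not appear immediately,
# only after some time.
spark.sql("SELECT * FROM blms.my_dataset.my_iceberg_table").show()
```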
I would like to understand why the sync from BigQuery to Spark is not immediate.