A table needs to be loaded. The input data is in JSON format and consists of a concatenation of multiple JSON documents. The file size is 3 GB, and a Small warehouse is being used. The following COPY INTO command was executed:
COPY INTO SAMPLE FROM @~/SAMPLE.JSON FILE_FORMAT = (TYPE = JSON)
The load failed with this error:
Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.
How can this issue be resolved?
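For context on the scenario above (this is not an answer key): the 16,777,216 figure in the error is Snowflake's 16 MB limit on a single parsed VARIANT value, so the failure indicates that at least one parsed JSON value exceeded that limit. The sketch below, which assumes the stage path from the question and a hypothetical named file format `my_json_format`, shows the kind of file format options typically involved when tuning a JSON load; whether any given option resolves the error depends on the structure of the source file.

```sql
-- Hypothetical sketch; stage path is from the question, format name is illustrative.
-- STRIP_OUTER_ARRAY is only relevant when the file wraps its documents in a
-- single outer array; it tells the parser to load each element as its own row
-- instead of one oversized value. The 16 MB limit still applies to each
-- individual parsed value.
CREATE OR REPLACE FILE FORMAT my_json_format
  TYPE = JSON
  STRIP_OUTER_ARRAY = TRUE;

COPY INTO SAMPLE
  FROM @~/SAMPLE.JSON
  FILE_FORMAT = (FORMAT_NAME = my_json_format);
```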
Which command should be used to load data from a file, located in an external stage, into a table in Snowflake?
Where is Snowflake metadata stored?
Which of the following can be used when unloading data from Snowflake? (Choose two.)
Which data types are supported by Snowflake when using semi-structured data? (Choose two.)
A Snowflake user executed a query and received the results. Another user executed the same query 4 hours later. The data had not changed.
What will occur?
In an auto-scaling multi-cluster virtual warehouse with the setting SCALING_POLICY = ECONOMY enabled, when is another cluster started?
When publishing a Snowflake Data Marketplace listing into a remote region, what should be taken into consideration? (Choose two.)
What is the default file size when unloading data from Snowflake using the COPY command?
Which of the following statements apply to Snowflake in terms of security? (Choose two.)