
Asked 1 month ago by CometStargazer366

Why does my ADF Snowflake dynamic query error with 'template function iem'?


I have an ADF Copy Activity (Snowflake source, Azure Storage sink) that dynamically builds a query by passing in the schema and table name using variables. The dynamic query includes double quotes around the schema and table, which are required by Snowflake, but it fails with the error "The template function 'iem' is not defined or not valid". When I hardcode the query (e.g., SELECT * FROM "schema"."table"), everything works fine.

I updated the query to a simple SELECT statement with no parameters, yet I still receive the same error. Below is a JSON excerpt from my For Each loop (the full pipeline is much larger):

JSON
{ "name": "ForEachINCTable", "type": "ForEach", "dependsOn": [ { "activity": "LookupINCTable", "dependencyConditions": [ "Succeeded" ] } ], "userProperties": [], "typeProperties": { "items": { "value": "@activity('LookupINCTable').output.value", "type": "Expression" }, "batchCount": 10, "activities": [ { "name": "WriteCopyLogInc", "type": "SqlServerStoredProcedure", "dependsOn": [ { "activity": "aCopyINC", "dependencyConditions": [ "Completed" ] } ], "policy": { "timeout": "7.00:00:00", "retry": 1, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [], "typeProperties": { "storedProcedureName": "Log.InsertADFActivityLog", "storedProcedureParameters": { "01ID": { "value": { "value": "@activity('aCopyINC').ActivityRunId", "type": "Expression" }, "type": "String" }, "02ProcessVariablesJSON": { "value": { "value": "@{item()}" , "type": "Expression" }, "type": "String" }, "03ActivityOutputJSON": { "value": { "value": "@{activity('aCopyINC')}" , "type": "Expression" }, "type": "String" }, "04PipelineVariablesJSON": { "value": { "value": "@{pipeline()}" , "type": "Expression" }, "type": "String" } } }, "linkedServiceName": { "referenceName": "dnaMetadataLS", "type": "LinkedServiceReference" } }, { "name": "AddFileDescriptionINC", "type": "SqlServerStoredProcedure", "dependsOn": [ { "activity": "aCopyINC", "dependencyConditions": [ "Succeeded" ] } ], "policy": { "timeout": "7.00:00:00", "retry": 1, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [], "typeProperties": { "storedProcedureName": "[Metadata].InsertFileDescription", "storedProcedureParameters": { "01TTID": { "value": { "value": "@item().SourceEntityId", "type": "Expression" }, "type": "Int64" }, "02ADFRunId": { "value": { "value": "@pipeline().TriggerId", "type": "Expression" }, "type": "Guid" }, "03Container": { "value": { "value": "@item().TargetSubSystem", "type": "Expression" }, "type": "String" }, "04Directory": { 
"value": { "value": "@{item().TargetDirectory}", "type": "Expression" }, "type": "String" }, "05File": { "value": { "value": "@{item().TargetFile}", "type": "Expression" }, "type": "String" }, "06CreatedDate": { "value": { "value": "@formatDateTime(utcnow(),'yyyy-MM-dd HH:mm:ss')", "type": "Expression" }, "type": "DateTime" }, "07LoadType": { "value": { "value": "@item().LoadType", "type": "Expression" }, "type": "String" }, "08IncrementalFrom": { "value": { "value": "@item().IncrementalFrom", "type": "Expression" }, "type": "String" }, "09IncrementalTo": { "value": { "value": "@item().IncrementalTo", "type": "Expression" }, "type": "String" }, "10LoadStatusID": { "value": "2", "type": "Int32" } } }, "linkedServiceName": { "referenceName": "dnaMetadataLS", "type": "LinkedServiceReference" } }, { "name": "aCopyINC", "type": "Copy", "dependsOn": [ { "activity": "Script Get Trigger Time", "dependencyConditions": [ "Succeeded" ] } ], "policy": { "timeout": "7.00:00:00", "retry": 2, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [ { "name": "SourceSystem", "value": "@{item().ConnectionID}" }, { "name": "Table", "value": "@{item().SourceEntityName}" }, { "name": "TTID", "value": "@{item().SourceEntityId}" } ], "typeProperties": { "source": { "type": "SnowflakeV2Source", "query": { "value": "@concat('SELECT * FROM \"', item().SourceSchemaName, '\".\"', item().SourceEntityName, '\"',\n ' WHERE ', item().IncrementalColumn, ' >= '', formatDateTime(replace(item().IncrementalFrom, 'T', ' '), 'yyyy-MM-ddTHH:mm:ss.fff'), \n ''' AND ', item().IncrementalColumn, ' < ''', formatDateTime(replace(item().IncrementalTo, 'T', ' '), 'yyyy-MM-ddTHH:mm:ss.fff'), '''')", "type": "Expression" }, "exportSettings": { "type": "SnowflakeExportCopyCommand" } }, "sink": { "type": "DelimitedTextSink", "storeSettings": { "type": "AzureBlobStorageWriteSettings" }, "formatSettings": { "type": "DelimitedTextWriteSettings", "quoteAllText": true, 
"fileExtension": ".txt" } }, "enableStaging": false }, "inputs": [ { "referenceName": "DSRedzone", "type": "DatasetReference", "parameters": { "tableName": { "value": "@item().SourceEntityName", "type": "Expression" }, "schemaName": { "value": "@item().SourceSchemaName", "type": "Expression" } } } ], "outputs": [ { "referenceName": "DSBlobStorage_SAS_csv", "type": "DatasetReference", "parameters": { "Container": { "value": "@item().TargetSubSystem", "type": "Expression" }, "Directory": { "value": "@{item().TargetDirectory}", "type": "Expression" }, "File": { "value": "@{item().TargetFile}", "type": "Expression" }, "Separator": ",", "HeaderRow": "true", "Escape": "\"", "Quote": "\"" } } ] }, { "name": "Update Incremental Value", "type": "Script", "dependsOn": [ { "activity": "aCopyINC", "dependencyConditions": [ "Succeeded" ] } ], "policy": { "timeout": "0.12:00:00", "retry": 0, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [], "linkedServiceName": { "referenceName": "dnaMetadataLS", "type": "LinkedServiceReference" }, "typeProperties": { "scripts": [ { "type": "NonQuery", "text": { "value": "UPDATE [Metadata].[IncrementalState]\nSET [LastRunIncrementalFrom] = CONVERT(DATETIME2(0),'@{item().IncrementalFrom}',120),\n [LastRunIncrementalTo] = CONVERT(DATETIME2(0), SYSDATETIME(),120)\nWHERE SourceEntityID = @{item().SourceEntityId} ;\n\nUPDATE [Metadata].[IncrementalState]\nSET [IncrementalFrom] = CONVERT(DATETIME2(0), SYSDATETIME(), 120),\n [IncrementalTo] = '9999-12-31 00:00:00'\nWHERE SourceEntityID = @{item().SourceEntityId} ;", "type": "Expression" } } ], "scriptBlockExecutionTimeout": "02:00:00" } }, { "name": "Script Get Trigger Time", "type": "Script", "dependsOn": [], "policy": { "timeout": "0.12:00:00", "retry": 0, "retryIntervalInSeconds": 30, "secureOutput": false, "secureInput": false }, "userProperties": [], "linkedServiceName": { "referenceName": "dnaMetadataLS", "type": "LinkedServiceReference" }, 
"typeProperties": { "scripts": [ { "type": "Query", "text": { "value": "DECLARE @@TriggerTime AS Datetime2(0);\n\nSELECT @@TriggerTime = TriggerTime \nFROM [Metadata].[PipelineExecutionLastSuccessfull]\nWHERE pipeline='@{pipeline().Pipeline}';\n\n", "type": "Expression" } } ], "scriptBlockExecutionTimeout": "02:00:00" } } ] } }

Any suggestions to resolve this error are much appreciated.


4 Answers

Answer by StackOverflowTychoAI Agent 1 month ago


Below is what’s most likely happening:

• The error message ("The template function 'iem' is not defined or not valid") usually means the dynamic expression parser is misinterpreting part of your query string. In your case, the way quotes are embedded in the concat expression appears to be breaking the parser, so it reads a fragment of item() as a call to a function named "iem".

• Typically this happens when mixing double quotes, single quotes, and line breaks while trying to inject literal double quotes (required by Snowflake) and single quotes (for string literals). For example, in your original string you have portions like:
  … item().IncrementalColumn, ' >= '', formatDateTime(...), ''' …
When parsed, the extra single quotes cause the string to break in the wrong place.

To resolve it, you need to ensure that:
 – The literal double quotes for schema and table are escaped correctly.
 – The literal single quotes around your date values are handled properly; the best practice for ADF expressions is to double up any literal single quote inside a string.
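The quote-doubling rule can be sanity-checked outside ADF. The sketch below is an illustration only: the schema, table, column, and date values are made up, and plain Python strings stand in for the ADF literals with their escaping undone. It shows the SQL string the corrected expression should produce:

```python
# Stand-ins for item().SourceSchemaName, item().SourceEntityName,
# item().IncrementalColumn, and the incremental date window.
schema, table = "MYSCHEMA", "MYTABLE"
col = "UPDATED_AT"
date_from, date_to = "2024-01-01T00:00:00.000", "2024-02-01T00:00:00.000"

# Each piece below is an ADF literal with its escaping undone:
# the ADF fragment ' >= ''' is the plain string " >= '".
sql = "".join([
    'SELECT * FROM "', schema, '"."', table, '"',
    " WHERE ", col, " >= '", date_from,
    "' AND ", col, " < '", date_to, "'",
])
print(sql)
# SELECT * FROM "MYSCHEMA"."MYTABLE" WHERE UPDATED_AT >= '2024-01-01T00:00:00.000' AND UPDATED_AT < '2024-02-01T00:00:00.000'
```

If the string printed here is valid when pasted into a Snowflake worksheet, the remaining work is only getting the ADF escaping to reproduce it.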

One working approach is to rewrite your expression without line breaks and with quotes clearly separated. For example, try something like this:


JSON
"value": "@concat('SELECT * FROM \"', item().SourceSchemaName, '\".\"', item().SourceEntityName, '\"', ' WHERE ', item().IncrementalColumn, ' >= ''', formatDateTime(replace(item().IncrementalFrom, 'T', ' '), 'yyyy-MM-ddTHH:mm:ss.fff'), ''' AND ', item().IncrementalColumn, ' < ''', formatDateTime(replace(item().IncrementalTo, 'T', ' '), 'yyyy-MM-ddTHH:mm:ss.fff'), '''')"

A few notes on the above:

  1. The first part builds the FROM clause by concatenating literals that include escaped double quotes (\").
  2. The WHERE-clause portions use doubled single quotes ('') around the formatted dates; a doubled single quote tells the parser you want one literal single quote in the final SQL string. Where you see ''' in the expression, that is the doubled quote plus the string delimiter.
  3. Avoiding unnecessary line breaks can also help the parser not “split” the function calls.

Double-check that each literal and function call (especially item(), formatDateTime(), and replace()) is properly terminated. With the quotes adjusted, the dynamic expression parser should properly recognize item() as a function rather than misinterpreting part of a string as a function name “iem.”

In summary, review your expression's quote escaping. The error is not caused by Snowflake rejecting the schema/table names but by the way the dynamic expression is being built. Making these adjustments should resolve the "template function 'iem' not defined" error.


Answer by SaturnianObserver202 1 month ago


You can try string interpolation to write your query in the activity.

SQL
SELECT * FROM "@{item().source}"."@{item().table}"
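As a rough analogy (Python here, not ADF itself), @{...} string interpolation behaves like an f-string: the value is spliced into the surrounding text and the double quotes stay literal. This sketch uses made-up values for item().source and item().table:

```python
# Stand-ins for item().source and item().table from the ForEach loop.
source, table = "PUBLIC", "CUSTOMERS"

# ADF's @{...} interpolation splices each value into the surrounding
# text, much like an f-string, so the double quotes remain literal.
query = f'SELECT * FROM "{source}"."{table}"'
print(query)  # SELECT * FROM "PUBLIC"."CUSTOMERS"
```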


This will generate the required query.


The query does work when I preview the data, so this might be a transient issue. Try re-running the pipeline after the failure. I also suggest publishing the pipeline changes, then reopening the browser after clearing its cache. Beyond that, test with a new Integration Runtime, and try triggering the pipeline manually instead of running it in debug mode.


Answer by AuroraAstronaut203 1 month ago


Looks like the issue is with how the dynamic query is being formed in ADF. The error "The template function 'iem' is not defined or not valid" suggests a syntax issue in your expression. Make sure you're properly escaping the double quotes in the dynamic query. Try using @concat('"', variables('schema'), '"."', variables('table'), '"') to explicitly wrap the schema and table names in double quotes. Also verify that there are no extra spaces or unexpected characters in the variable values. If you're using @{} syntax, double-check that all functions and variables are correctly referenced.
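The suggested @concat can be mirrored in plain Python to check its output. This is a hypothetical helper with made-up schema/table values, not an ADF construct:

```python
def qualified_name(schema: str, table: str) -> str:
    # Mirrors @concat('"', variables('schema'), '"."', variables('table'), '"'):
    # wraps both parts in the double quotes Snowflake needs for quoted identifiers.
    return f'"{schema}"."{table}"'

print(qualified_name("sales", "orders"))  # "sales"."orders"
```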


Answer by EclipseMariner046 1 month ago


I've created the pipeline again from scratch and it is now working with the lookup and foreach.

The initial pipeline was a copy of an existing one with the datasets changed. I've had issues before when I've copied existing pipelines but I've never seen this error before.

In future I'll just create all pipelines from scratch I think.

Thanks everyone for your help.
