insightsoftware Documentation

ImportSpec and ExportSpec Functions

A function is an instruction word representing an action you want to perform in Longview Application Framework. You can carry out the action by typing a function statement in an ImportSpec or ExportSpec Document in Longview Application Framework.

Note: Unless otherwise specified, the following functions are available for use in both ImportSpecs and ExportSpecs.

The following functions apply to DataAreas:

  • Assign
  • DataSource (ImportSpec only)
  • DataTarget (ExportSpec only)
  • DecimalCharacter
  • Define (for DataAreas)
  • DuplicateMappings
  • DuplicateRecords
  • ErrorFile
  • IgnoreHeaderRecords
  • IgnoreFooterRecords
  • IncludeHeader (ExportSpec only)
  • LogFile
  • MaxError
  • Put
  • ReverseSign
  • ScientificNotation
  • Search
  • Set
  • ValueField

The following functions apply to DataTables:

  • DataSource (ImportSpec only)
  • DataTarget (ExportSpec only)
  • DecimalCharacter
  • Define (for DataTables)
  • ErrorFile
  • IgnoreHeaderRecords
  • IgnoreFooterRecords
  • IncludeHeader (ExportSpec only)
  • LogFile
  • QuotedStrings
  • RegionalDates
  • ScientificNotation
  • Search
  • Set
  • StringEncodedSymbolNames

Assign

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to designate dimension names when each external record contains, or will get, multiple value fields, providing a way to associate each value field with specific symbols.

The ASSIGN function can be used for only one dimension in a specification.

Syntax:

ASSIGN DimName

where:

  • DimName is the dimension across which the multiple values will be associated.

The ASSIGN function must be used with the PUT function, and can only be used if the dimension has not been set by DEFINE, and if the VALUEFIELD function has not been used.
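For example, assuming a DataArea whose TIMEPER dimension should receive three value columns from each external record (all field and symbol names below are hypothetical), ASSIGN is paired with PUT as follows:

ASSIGN TIMEPER

PUT Jan, MonthOneVal

PUT Feb, MonthTwoVal

PUT Mar, MonthThreeVal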


DataSource

Use this function with an import specification to define the source of the data to be imported. This keyword is available in ImportSpecs only, is mandatory, and specifies the type and source of data for the import operation. The parameters for the DataSource function depend on the type of connection specified.

Syntax for Text

DataSource SourceType, "FileName", EXTERNAL[, "Delimiter|REGIONAL"]

where:

  • SourceType is one of the following:
    Parameter Description

    TEXT

    Imports the data from a delimited text file.

    JSON

Imports the data from a JSON-formatted file. The file can be in either verbose or concise format; the import automatically detects which format is used.

    ODBC

Imports the data directly from a database using an ODBC connection.

  • FileName is the name of the source file and it must be enclosed in double quotation marks.

    Note: By default, ODBC connections through Application Framework are allowed. To restrict all ODBC connections through Application Framework, open the lv_af.cfg file in the working directory of lv_af.exe and add the following statement to the end of the document: PERMIT_DATASOURCE_ODBC=FALSE

  • Delimiter is the character used to separate the fields in the source file and must be enclosed in double quotation marks. If not specified, the default delimiter is {. Alternatively, you can use the keyword REGIONAL, which uses the client machine’s regional settings to determine the delimiter character. This parameter applies only when SourceType is TEXT.

Example:

DataSource TEXT, "MyData", EXTERNAL, "{"

DataSource JSON, "MyData", EXTERNAL

DataSource ODBC, "Driver={SQL Server Native Client 10.0};Server=localhost;Database=mydb;Trusted_Connection=Yes;"


DataTarget

Use this function with an export specification to define the target of the data to be exported. This keyword is available in ExportSpecs only, is mandatory, and specifies the target and type of data for the export operation.

Syntax:

DataTarget TargetType, "FileName", EXTERNAL[, "Delimiter|REGIONAL"][, VERBOSE|CONCISE]

where:

  • TargetType is one of the following:

    Note: DataTables do not support ODS or JSON.

    Parameter Description

    TEXT

Exports the data to a text file. You must use this keyword if you are exporting from a DataTable.

    JSON

    Exports the data in JSON format. The JSON file can be either verbose or concise format.

    ODS

    Exports the data to an ODS file. For more information on ODS, see the Longview Integration Guide.

  • "FileName" is the name of the target file and must be enclosed in double quotation marks.

    Note: If the document is in the same location as lv_af.exe, you do not need to specify the drive or path. If the path specified does not exist, it will be created. The folder path can have a maximum length of 260 characters.

  • Delimiter is the character used to separate the fields in the output file and must be enclosed in double quotation marks. If not specified, the default delimiter is {. Alternatively, you can use the keyword REGIONAL, which uses the client machine’s regional settings to determine the delimiter character. This parameter applies only when TargetType is TEXT.

  • VERBOSE|CONCISE is the JSON format in which the data is output. This parameter is optional and defaults to VERBOSE. For more information on the VERBOSE and CONCISE output types, see Sample JSON source files and ImportSpec for DataArea Import.

Syntax example:

DataTarget TEXT, "MyData", EXTERNAL

DataTarget TEXT, "MyDataTable", EXTERNAL, "REGIONAL"

DataTarget JSON, "MyData", EXTERNAL, CONCISE

DataTarget JSON, "MyData", EXTERNAL

DecimalCharacter

This function does not apply when JSON is used for DataSource or DataTarget.

Use this function to specify whether the decimal character used in the source or target file is a comma or a period.

Syntax:

DecimalCharacter "Character|REGIONAL"

where:

  • Character is the decimal character for the import or export file and can be either a period ( . ) or a comma ( , ). The default decimal character is a period ( . ).
  • REGIONAL specifies to use the client machine’s regional settings to determine the decimal character for import and export. This keyword must be enclosed in double quotation marks.

Syntax example:

DecimalCharacter ","

DecimalCharacter "REGIONAL"

Define (for DataAreas)

Use this function to specify how a field in the source file will map to the dimensions.

Syntax:

DEFINE DimName, MapMethod, FieldName|SymName[, MapName, INTERNAL|EXTERNAL]

where:

  • DimName is the dimension being defined.
  • MapMethod is a mapping method and is one of the following:
    Value Description

    MATCH

    The values encountered in field FieldName must match the dimension symbols exactly.

    MAP

    The values encountered in field FieldName will be mapped between dimension symbols via the Map specified by MapName.

    UNIQUE

    Import operations: all values encountered will be placed in the symbol SymName (symbol names are not derived from any source field).

    Export operations: only values in the source DataArea belonging to the symbol SymName will be exported.

    CONSECUTIVE

    Values will be placed consecutively (in the order they are encountered) in the dimension, starting at symbol SymName and moving in priority order through that symbol's siblings. CONSECUTIVE is designed to work with a symbol that exists in only one hierarchy; if the symbol belongs to more than one hierarchy, the behavior used to determine the siblings is undefined and may be affected by hierarchy reorganizations and imports/exports. If the dimension is a schedule dimension, there is no concept of hierarchy or priority, so symbols are used in their defined order. If the end of the sibling or symbol list is reached, the next value generates an error.

    Note: The CONSECUTIVE parameter is available for use in ImportSpecs only.

  • FieldName|SymName is one of the following:
    Value Description
    FieldName

    This parameter applies only when MapMethod is MATCH or MAP.

    This can be any alphanumeric identifier, and is used only within the scope of the import action. This parameter must match exactly the FieldName specified in your Set function.

    SymName

    This parameter applies only when MapMethod is UNIQUE or CONSECUTIVE.

    This is the name of a valid symbol for the specified dimension.

  • MapName is required when MapMethod is MAP and is one of the following:
    • the name of a map that exists inside the database, created using the Mappings editor. You must specify INTERNAL as the map location for this type of map.
    • the path and file name, including the relevant extension, of a map that exists as a file, such as a text file on your local machine or network location. If you do not specify the path, the system assumes the path is C:\...\MyDocuments\Longview. You must specify EXTERNAL as the map location for this type of map.

A valid ImportSpec or ExportSpec must contain a DEFINE or ASSIGN for each base dimension in the DataArea. If the DataArea contains schedule data, each extra dimension must also relate to a DEFINE or ASSIGN.

Syntax example:

DEFINE Accounts, MATCH, Acct

DEFINE TimePeriods, MAP, TimePer, TimePeriodsMap, INTERNAL

DEFINE Entities, MAP, Ent, MyMaps\EntitiesMap.txt, EXTERNAL

DEFINE Currencies, UNIQUE, USD

DEFINE Details, CONSECUTIVE, AgingT

It is possible to use a single named field in an ImportSpec with one or more keywords (Define, Put, ValueField), avoiding the need to transform the source file outside Longview.

To achieve this, you can use multiple Define statements that refer to the same field name but different map files. For example:

Sample Data File

72740000, Bike, 123.45

72740001, Skateboard, 987.65

73740000, Bike, 12.34

73740001, Skateboard, 98.76

Sample ImportSpec

Set AccEnt, 1

Set Tim, 2

Set Det, 4

Set Cur, 5

Set Seg, 6

Set Ele, 7

Set Con, 8

Set Val, 9

Define ACCOUNTS, Map, AccEnt, AccountMapping, INTERNAL

Define TIMEPER, Match, Tim

Define ENTITIES, Map, AccEnt, EntityMapping, INTERNAL

Define DETAILS, Match, Det

Define CURRENCY, Match, Cur

Define SEGMENTS, Match, Seg

Define ELEMENTS, Match, Ele

Define CONTROLS, Match, Con

ValueField Val

Sample AccountMapping

40000, Wildcard, ???40000

40001, Wildcard, ???40001

Sample Entity Mapping

727, Wildcard, 727*

737, Wildcard, 737*


Define (for DataTables)

Use this function to specify how a field in the source file maps to a DataTable field.

Syntax:

DEFINE TableFieldName, MapMethod, FieldName|SymName[, MapName, INTERNAL|EXTERNAL]

where:

  • TableFieldName is the name of the DataTable field to which to map the field in the source file.
  • MapMethod is a mapping method and is one of the following:
    Value Description
    MATCH

    The values encountered in field FieldName must match the dimension symbols exactly.

    MAP

    The values encountered in field FieldName will be mapped between dimension symbols via the Map specified by MapName.

    UNIQUE

    Import operations: The SymName will be placed in the corresponding TableFieldName.

  • FieldName|SymName is one of the following:
    Value Description
    FieldName

    This parameter applies only when MapMethod is MATCH or MAP. This can be any alphanumeric identifier, and is used only within the scope of the import action. This parameter must match exactly the FieldName specified in your Set function.

    SymName

    This parameter applies only when MapMethod is UNIQUE. This is the name of a valid symbol for the specified dimension.

  • MapName is required when MapMethod is MAP and is one of the following:
    • the name of a map that exists inside the database, created using the Mappings editor. You must specify INTERNAL as the map location for this type of map.
    • the path and file name, including the relevant extension, of a map that exists as a file, such as a text file on your local machine or network location. If you do not specify the path, the system assumes the path is C:\...\MyDocuments\Longview. You must specify EXTERNAL as the map location for this type of map.

Syntax example:

Define Account, Map, Acc, MyAccountMap, INTERNAL

Define CostCentre, Match, CostCentre

Define Company, Map, CompCode, MyCompanyMap, INTERNAL

Define Currency, Match, Curr

Define PostPeriod, Unique, [[ SYSTEM, SGPCurrentPeriod, DBDEFAULT ]]

Define Amount, Match, Val

Define DebitOrCredit, Match, DRCR

Define PostDate, Match, PostDate

Define MyColumnName, Match, FieldX

It is possible to use a single named field in an ImportSpec multiple times, avoiding certain external transformations of the source file. To achieve this, you can use multiple Define statements that refer to the same FieldName but different map files.

For example, if Field1 in the source file is similar to Account.CostCenter.CompanyCode.Currency (e.g.: 11000.007.123.CAD), you can use Define statements similar to the following:

Define Account, Map, Field1, MyAccountMap, INTERNAL

Define CostCentre, Map, Field1, MyCCMap, INTERNAL

Define Company, Map, Field1, MyCompanyMap, INTERNAL

Define Currency, Map, Field1, MyCurrMap, INTERNAL

DuplicateMappings

Use this Longview Application Framework function to specify how your process handles duplicate mappings. Duplicate mappings can occur when a dimension’s DEFINE function is set to MAP, and the specified field's value (for imports) or symbol name (for exports) for a given record matches multiple maps in the Symbol Map.

Syntax:

DUPLICATEMAPPINGS Action

where:

  • Action is the action to take when the system encounters duplicate mappings in a source file. Select one of the following:
    Value Description

    ALLOW

    Specifies that duplicate maps are permitted. In this case, if multiple EXACT maps are found, all mappings will be used so that the same value is imported to multiple data cells.

    Import operations only: If no EXACT maps are found, the value will get imported to only one location, the one corresponding to the first MAP listed.

    DISALLOW

    Specifies that duplicate maps should not be allowed. The first entry is submitted. If a duplicate map is encountered, the system reports an error.

    Note: This is the default setting.

Syntax example:

DUPLICATEMAPPINGS ALLOW

DuplicateRecords

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to specify how your process handles duplicate records.

Caution: Using the ADD option adds values in the file to the existing DataArea value. Using all other options overwrites existing DataArea values. Longview recommends clearing the DataArea first before using the ADD option.

Syntax:

DUPLICATERECORDS Action

where:

  • Action is the action to take when the system encounters duplicate records in a source file. Select one of the following:
    • For use in both ImportSpecs and ExportSpecs:
      Value Description

      ADD

      Specifies that duplicate records should be allowed, with the value of all such records being summed. In this mode, if a duplicate record is encountered that contains text (versus numeric) data, it will be treated as an invalid/error record.

      DISALLOW

      Specifies that duplicate records should not be allowed. The first entry is submitted. If a duplicate record is encountered, the system reports an error.

      Note: This is the default setting.

    • For use in ImportSpecs only:
      Value Description

      USEFIRST

      Specifies that the first entry of a duplicate record should be used. If a duplicate record is encountered, the system does not report an error.

      USELAST

      Specifies that the last entry of a duplicate record should be used. If a duplicate record is encountered, the system does not report an error.

    • For use in ExportSpecs only:
      Value Description

      ALLOW

      Specifies that duplicate records are allowed, and are all exported to the target.

      Note: This option is not allowed if the Assign or Put functions are used.

Syntax example:

DUPLICATERECORDS USELAST
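As a worked illustration of the ADD option, suppose the source file contains these two records for the same intersection (the record layout and values here are hypothetical):

Cash, Jan, 100

Cash, Jan, 50

With DUPLICATERECORDS ADD, a single value of 150 is imported for that intersection; with the default DISALLOW, the second record would be reported as an error.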


ErrorFile

Use this Longview Application Framework function to specify the name of the error file to contain data on invalid records.

This will create a text file in the default folder. All rejected records will be captured in this file.

Syntax:

ERRORFILE FileName

where:

  • FileName is the name of the error file to contain data on invalid records. It can include a complete or partial folder path in the format C:\...\FileName. If FileName includes spaces, enclose it in double quotation marks; for example:

    "C:\My Documents\My Data.txt"

Note: If the document is in the same location as lv_af.exe, you do not need to specify the drive or path. If the path specified does not exist, it will be created. The folder path can have a maximum length of 260 characters.

Syntax example:

ERRORFILE error.txt

IgnoreHeaderRecords

This function does not apply to:

  • ExportSpecs
  • ImportSpecs using DataSource JSON

Use this Longview Application Framework function to specify the number of header rows to ignore on import. If you do not include this function in your import specification, the system assumes your import file starts with data records with no header rows.

Syntax

IGNOREHEADERRECORDS Num

where:

  • Num is the number of header records to ignore. Blank lines at the top of the import file are treated as records.

Syntax example:

IGNOREHEADERRECORDS 2

IgnoreFooterRecords

This function does not apply to:

  • ExportSpecs
  • ImportSpecs using DataSource JSON

Use this Longview Application Framework function to specify the number of footer rows to ignore on import. If you do not include this function in your import specification, the system assumes your import file ends with data records and has no footer rows.

Syntax:

IGNOREFOOTERRECORDS Num

where:

  • Num is the number of footer records to ignore. Blank lines at the bottom of the import file are treated as records.

Syntax example:

IGNOREFOOTERRECORDS 2

IncludeHeader

This function does not apply when DataTarget uses JSON.

Use this Longview Application Framework function to include a default or custom header when exporting from a DataArea or DataTable.

Syntax:

IncludeHeader AUTO|SPECIFY[, "line 1", "line 2", ...]

where:

  • AUTO auto-populates the header with the dimension descriptions for each dimension and VALUE for the value field. If there are multiple value fields, the symbol description will be output instead of VALUE.
  • SPECIFY allows you to specify a custom header. The header can be a single line or multiple lines. Each line must be enclosed in quotes and separated by a comma.

Syntax example:

INCLUDEHEADER AUTO

INCLUDEHEADER SPECIFY, "Account, Entity, Time Period, Value"

INCLUDEHEADER SPECIFY, "File created on May 31 2017", "****"

LogFile

Use this Longview Application Framework function to specify the name of the log file that will contain information about the processing of the source or target file. This creates a text file in the default folder; all log information is captured in this file.

Syntax:

LOGFILE FileName

where:

  • FileName is the name of the log file. It can include a complete or partial folder path in the format C:\...\FileName. If FileName includes spaces, enclose it in double quotation marks; for example:

    "C:\My Documents\My Data.txt"

Note: If the document is in the same location as lv_af.exe, you do not need to specify the drive or path. If the path specified does not exist, it will be created. The folder path can have a maximum length of 260 characters.

Syntax examples

LOGFILE log.txt

MaxError

This function does not apply to ImportSpecs and ExportSpecs for DataTables; for DataTables, the maximum number of error records is always 1.

Use this Longview Application Framework function to specify the maximum number of error records to permit before stopping a process. This function is optional.

Syntax:

MAXERROR NumErrorRecords

where:

  • NumErrorRecords is the maximum number of error records to permit before stopping an import. The default value for this function is 0.

Syntax examples

MAXERROR 1

Put

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function in both ImportSpecs and ExportSpecs to designate symbol and field names when each external record contains (or will get) multiple value fields, providing a way to associate each value field with specific symbols.

Syntax:

PUT SymName, FieldName

where:

  • SymName is a symbol name in this dimension.
  • FieldName is the field with which this symbol is to be associated.

The PUT function can be used only if the ASSIGN dimension has been specified, and it is typically specified multiple times (to target multiple fields to multiple symbols). This function cannot be used in conjunction with the ValueField function.
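For example, assuming a source file in which columns 5 through 7 hold the values for three months (the field and symbol names below are hypothetical), the corresponding SET, ASSIGN, and PUT statements could look like this:

SET MonthOneVal, 5

SET MonthTwoVal, 6

SET MonthThreeVal, 7

ASSIGN TIMEPER

PUT Jan, MonthOneVal

PUT Feb, MonthTwoVal

PUT Mar, MonthThreeVal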


QuotedStrings

This function is available only for DataTables.

If you’re importing from or exporting to a .csv file, you may need to enclose string values in double quotation marks. Use this function in an ImportSpec to remove the double quotation marks from string values on import. Similarly, use this function in an ExportSpec to enclose string values in double quotation marks on export to .csv.

Syntax:

QuotedStrings ON|OFF

where:

  • ON encodes/decodes strings with double quotation marks.
  • OFF indicates that string values should be exported or imported as-is.

Syntax examples

QuotedStrings ON

RegionalDates

This function is available only for DataTables.

The system stores dates for DataTable objects and App tables in ISO 8601 format (yyyy-mm-dd). Use this function in an ExportSpec to convert dates from ISO 8601 format to the format specified by the client machine’s regional settings.

Similarly, you can also use this function in an ImportSpec to convert dates from the format specified by the client machine’s regional settings to ISO 8601.

This can be useful when working with DataTable data in a .csv file that you plan to import into your Longview system.

Syntax:

RegionalDates ON|OFF

where:

  • ON converts dates from ISO 8601 to regional format and vice-versa.
  • OFF imports and exports date values without any conversion.

Syntax examples

RegionalDates ON

ReverseSign

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to toggle logic dealing with sign reversal of numeric values.

When this logic is enabled (ON), the system considers the Account symbol's balance type property. If this property indicates a credit Account, the numeric value being imported or exported has its sign reversed.

The default value for ReverseSign is OFF.

Syntax:

REVERSESIGN ON|OFF

Syntax examples

REVERSESIGN ON

ScientificNotation

Use this function in an ImportSpec if the data in your import file is in scientific notation format.

Syntax:

ScientificNotation ON|OFF

where:

  • ON enables importing of data in scientific notation format.
  • OFF disables importing of data in scientific notation format.

Syntax examples

ScientificNotation ON

Search

Use this Longview Application Framework function to filter source records based on values of specified fields.

Syntax:

SEARCH SearchExpression

where:

  • SearchExpression consists of one or more relational expressions, linked by AND and OR and grouped by brackets, as appropriate:
    [(] RelationalExpression [AND|OR RelationalExpression [)] [AND|OR …]]

    Each RelationalExpression is created using field names, operators and values, as follows:

    RelationalExpression = FieldName EQ|NE|LE|LT|GE|GT "Value"

    The value parameter may optionally contain wildcards (? and *), and symbols for operators, such as ==, <=, >=, and so on.

    Multiple SEARCH expressions may appear within a single specification object.

Syntax examples

SEARCH "ACCT==??1 OR ACCT==??2"
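A compound filter can combine relational expressions with AND, OR, and brackets. For example, following the quoting and operator style of the example above, and assuming fields named ACCT and ENT have been declared with SET (the field names and values are hypothetical):

SEARCH "(ACCT==4* OR ACCT==5*) AND ENT==727*"

This would select only records whose account field starts with 4 or 5 and whose entity field starts with 727.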

Set

Use this function to assign unique names identifying the structural elements of the source data file. Each Set function must be matched by either a Define or a ValueField function.

Syntax:

SET FieldName, Position

where:

  • FieldName can be any alphanumeric identifier, and is only used within the scope of the import action.
  • Position is a parameter that can be a field number (for delimited text files) or an element[.subelement] name (for XML files). Multiple SET functions are used to refer to various fields in the data object, but not all fields require the SET function.

Syntax examples

SET Acct, 1

SET TimePer, 2

SET Ent, 3

SET Cur, 4

StringEncodedSymbolNames

This function is available only for DataTables.

If you’re importing from or exporting to a .csv file, you may need to enclose symbol names in square brackets so that numeric symbol names are not interpreted as numbers (and the leading zeros are retained). Use this function in an ExportSpec to enclose symbol names in square brackets on export to .csv. Similarly, use this function in an ImportSpec to remove the square brackets from symbol names on import.

Note: This logic applies only to symbol names in columns that have the symbol data type.

Syntax:

StringEncodedSymbolNames ON|OFF

where:

  • ON encodes/decodes symbol names with square brackets.
  • OFF indicates that symbol names should be exported or imported as-is.

Syntax examples

StringEncodedSymbolNames ON
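To illustrate with hypothetical data: with StringEncodedSymbolNames ON, a symbol-type column containing the symbol 00123 is written to the exported .csv as [00123], so the leading zeros are preserved when the file is opened in a spreadsheet; with OFF, the value is written as 00123 and may be reinterpreted as the number 123 by other tools.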

SQLBatchSize

Use this function to specify the number of records to retrieve with each fetch against the ODBC data source. If this function is not specified, the default is 1000 records. Specifying a larger number of records to retrieve may improve performance.

Note: This function applies only when DataSource is ODBC.

Syntax:

SQLBatchSize Size

where:

  • Size specifies the number of records to retrieve with each fetch against the ODBC data source.

Syntax examples

SQLBatchSize 500

SQLStatement

Use this function to specify the SQL statement to pass along to the ODBC data source.

Note: This function applies only when DataSource is ODBC.

Syntax:

SQLStatement Statement[, SQLFile]

where:

  • Statement is the SQL statement to pass along to the ODBC datasource and can be one of the following:
    Value Description

    "statement"

    Specifies the SQL statement in text, enclosed in double quotation marks.

    FILE

    Specifies that the SQL statement is specified in a file.

  • SQLFile is the file containing the SQL statement, including the extension, and applies only when Statement is FILE. The file name must be relative to the current directory.

Syntax examples

SQLStatement "SELECT Currency, Rate FROM Rates"

SQLStatement FILE, Sample.sql

ValueField

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to specify which field will contain the actual data value being imported or exported.

Syntax:

VALUEFIELD FieldName

where:

  • FieldName can be any alphanumeric identifier, and is only used within the scope of the import action.
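For example, assuming a three-field source file in which the third field holds the data value (the field names below are hypothetical), VALUEFIELD is paired with SET and DEFINE statements such as:

SET Acct, 1

SET TimePer, 2

SET Val, 3

DEFINE ACCOUNTS, MATCH, Acct

DEFINE TIMEPER, MATCH, TimePer

VALUEFIELD Val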


Published:

ImportSpec and ExportSpec Functions

A function is an instruction word representing an action you want to perform in Longview Application Framework. You can carry out the action by typing a function statement in an ImportSpec or ExportSpec Document in Longview Application Framework.

Note: Unless otherwise specified, the following functions are available for use in both ImportSpecs and ExportSpecs.

The following functions apply to DataAreas:

Assign ErrorFile Put
DataSource (ImportSpec only) IgnoreHeaderRecords ReverseSign
DataTarget (ExportSpec only) IgnoreFooterRecords ScientificNotation
DecimalCharacter IncludeHeader (ExportSpec only) Search
Define (for DataAreas) LogFile Set
DuplicateMappings MaxError ValueField
DuplicateRecords    

The following functions apply to DataTables:

DataSource (ImportSpec only) IgnoreHeaderRecords RegionalDates
DataTarget (ExportSpec only) IgnoreFooterRecords ScientificNotation
DecimalCharacter IncludeHeader (ExportSpec only) Search
Define (for DataTables) LogFile Set
ErrorFile QuotedStrings StringEncodedSymbolNames

Assign

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to designate dimension names when each external record contains, or will get, multiple value fields, providing a way to associate each value field with specific symbols.

The ASSIGN function can only be used in a single dimension.

Syntax:

ASSIGN DimName

where:

  • DimName is the dimension across which the multiple values will be associated.

The ASSIGN function must be used with the PUT function, and can only be used if the dimension has not been set by DEFINE, and if the VALUEFIELD function has not been used.

See also

DataSource

Use this function with import specification to define the source of the data to be imported. This keyword is available in a ImportSpecs only, is mandatory, and specifies the type and source of data for the import operation. The parameters for the DataSource function depend on the specified type of connection.

Syntax for Text

DataSource SourceType, "FileName", EXTERNAL[, "Delimiter|REGIONAL"]

where:

  • SourceType is one of the following:
    Parameter Description

    TEXT

    Imports the data from a delimited text file.

    JSON

    Imports the data from a JSON formatted file. The JSON file can be either verbose or concise format. The import will automatically determine which one it is.

    ODBC

    Imports the data directly from a database using ODBC connection.

  • FileName is the name of the source file and it must be enclosed in double quotation marks.

    Note: By default, ODBC connections through Application Framework are allowed. To restrict all ODBC connections through Application Framework, open the lv_af.cfg file found in the working directory of the lv_af.exe and add the following statement to the end of the document: PERMIT_DATASOURCE_ODBC=FALSE

  • Delimiter is the character that is used to separate the fields in the source file and must be enclosed in double quotation marks. If not specified, the default delimiter is {. Alternatively, you use the keyword REGIONAL, which will use the client machine’s regional settings to determine the delimiter character to use. Use this parameter when using the TEXT.

Example:

DataSource TEXT, "MyData", EXTERNAL, "{"

DataSource JSON, “MyData”, EXTERNAL DataSource ODBC, "Driver={SQL Server Native Client10.0};Server=localhost;Database=mydb ;Trusted_Connection=Yes;"

See also

DataTarget

Use this function with export specification to define the target of the data to be exported. This keyword is available in a Data Export Specification only, is mandatory, and specifies the target and type of data for the export operation.

Syntax:

DataTarget TargetType, "FileName", EXTERNAL, ["Delimiter|REGIONAL"] [,VERBOSE|CONCISE]

where:

  • TargetType is one of the following:

    Note: DataTables do not support ODS or JSON.

    Parameter Description

    TEXT

    Exports the data to a text file. You must use this keyword if you are exporting from a DataTable..

    JSON

    Exports the data in JSON format. The JSON file can be either verbose or concise format.

    ODBC

    Exports the data to an ODS file. For more information on ODS, see the “Longview Integration Guide”.

  • "FileName" is the name of the source file and must be enclosed in double quotation marks.

    Note: If the document is in the same location as lv_af.exe, you do not need to specify the drive or path. If the path specified does not exist, it will be created. The folder path can have a maximum length of 260 characters.

  • Delimiter is the character used to separate the fields in the ouput file and must be enclosed in double quotation marks. If not specified, the default delimiter is {. Alternatively, you use the keyword REGIONAL, which will use the client machine’s regional settings to determine the delimiter character to use. Use this parameter when using the TEXT.

  • VERBOSE|CONCISE is the JSON format in which the data will be output. This parameter is optional and defaults to VERBOSE if not specified. For more information on the VERBOSE and CONCISE output types, see Sample JSON source files and ImportSpec for DataArea Import.

Syntax example:

DataTarget TEXT, "MyData", EXTERNAL

DataTarget TEXT, "MyDataTable", EXTERNAL, "REGIONAL"

DataTarget JSON, "MyData", EXTERNAL, CONCISE

DataTarget JSON, "MyData", EXTERNAL

DecimalCharacter

This function does not apply when JSON is used for DataSource or DataTarget.

Use this function to specify whether the character that is used as a decimal character, in the source or target file, is a comma or a period.

Syntax:

DecimalCharacter "Character|REGIONAL"

where:

  • Character is the decimal character for the import or export file and can be either a period ( . ) or a comma ( , ). The default decimal character is a period ( . ).
  • REGIONAL specifies to use the client machine’s regional settings to determine the decimal character for import and export. This keyword must be enclosed in double quotation marks.

Syntax example:

DecimalCharacter ","

DecimalCharacter "REGIONAL"
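Longview does not expose this setting as code; the following Python sketch only illustrates the effect of the decimal character on parsing a numeric field. The function name is invented for illustration.

```python
def parse_value(text, decimal_char="."):
    """Parse a numeric field using the configured decimal character.

    Mirrors the idea behind DecimalCharacter: if the file uses a comma
    as the decimal separator, normalize it before converting.
    """
    if decimal_char == ",":
        text = text.replace(",", ".")
    return float(text)

print(parse_value("123.45"))        # default period decimal character
print(parse_value("123,45", ","))   # comma decimal character
```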

Define (for DataAreas)

Use this function to specify how a field in the source file will map to the dimensions.

Syntax:

DEFINE DimName, MapMethod, FieldName|SymName[, MapName, INTERNAL|EXTERNAL]

where:

  • DimName is the dimension being defined.
  • MapMethod is a mapping method and is one of the following:
    Value Description

    MATCH

    The values encountered in field FieldName must match the dimension symbols exactly.

    MAP

    The values encountered in field FieldName will be mapped between dimension symbols via the Map specified by MapName.

    UNIQUE

    Import operations: all values encountered will be placed in the symbol SymName (symbol names are not derived from any source field).

    Export operations: only values in the source DataArea belonging to the symbol SymName will be exported.

    CONSECUTIVE

    Values will be placed consecutively (in the order they are encountered) in the dimension starting at symbol SymName, moving in priority order through that symbol's siblings. CONSECUTIVE is designed to work with a symbol existing in only one hierarchy - if the symbol belongs to more than one hierarchy, the behavior used to determine the siblings is undefined and may be affected by hierarchy reorganizations and import/exports. If the dimension is a schedule dimension, there is no concept of hierarchy or priority, therefore symbols will be used in defined order. If the end of the sibling or symbol list is reached, the next value will generate an error.

    Note: The CONSECUTIVE parameter is available for use in ImportSpecs only.

  • FieldName|SymName is one of the following:
    Value Description
    FieldName

    This parameter applies only when MapMethod is MATCH or MAP.

    This can be any alphanumeric identifier, and is used only within the scope of the import action. This parameter must match exactly the FieldName specified in your Set function.

    SymName

    This parameter applies only when MapMethod is UNIQUE or CONSECUTIVE.

    This is the name of a valid symbol for the specified dimension.

  • MapName is required when MapMethod is MAP and is one of the following:
    • the name of a map that exists inside the database, created using the Mappings editor. You must specify INTERNAL as the map location for this type of map.
    • the path and file name, including the relevant extension, of a map that exists as a file, such as a text file on your local machine or network location. If you do not specify the path, the system assumes the path is C:\...\MyDocuments\Longview. You must specify EXTERNAL as the map location for this type of map.

A valid ImportSpec or ExportSpec must contain a DEFINE or ASSIGN for each base dimension in the DataArea. If the DataArea contains schedule data, each extra dimension must also relate to a DEFINE or ASSIGN.

Syntax example:

DEFINE Accounts, MATCH, Acct

DEFINE TimePeriods, MAP, TimePer, TimePeriodsMap, INTERNAL DEFINE Entities, MAP, Ent, MyMaps\EntitiesMap.txt, EXTERNAL

DEFINE Currencies, UNIQUE, USD

DEFINE Details, CONSECUTIVE, AgingT

It is possible to use a single named field in an ImportSpec with one or more keywords (Define, Put, ValueField) to avoid transforming the source file externally.

To achieve this, you can use multiple Define statements that refer to the same FieldName but different maps. For example:

Sample Data File

72740000, Bike, 123.45

72740001, Skateboard, 987.65

73740000, Bike, 12.34

73740001, Skateboard, 98.76

Sample ImportSpec

Set AccEnt, 1

Set Tim, 2

Set Det, 4

Set Cur, 5

Set Seg, 6

Set Ele, 7

Set Con, 8

Set Val, 9

Define ACCOUNTS, Map, AccEnt, AccountMapping, INTERNAL

Define TIMEPER, Match, Tim

Define ENTITIES, Map, AccEnt, EntityMapping, INTERNAL

Define DETAILS, Match, Det

Define CURRENCY, Match, Cur

Define SEGMENTS, Match, Seg

Define ELEMENTS, Match, Ele

Define CONTROLS, Match, Con

ValueField Val

Sample AccountMapping

40000, Wildcard, ???40000

40001, Wildcard, ???40001

Sample EntityMapping

727, Wildcard, 727*

737, Wildcard, 737*
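To make the worked example above concrete, this Python sketch shows how the two wildcard maps resolve the shared AccEnt field to different symbols. The data structures and function names are invented for illustration and are not Longview APIs; the snippet assumes wildcard matching behaves like shell-style patterns.

```python
from fnmatch import fnmatch

# Hypothetical in-memory versions of the sample maps above.
# Each entry: (target symbol, wildcard pattern applied to the source field).
account_map = [("40000", "???40000"), ("40001", "???40001")]
entity_map = [("727", "727*"), ("737", "737*")]

def map_symbol(value, symbol_map):
    """Return the first target symbol whose wildcard pattern matches."""
    for target, pattern in symbol_map:
        if fnmatch(value, pattern):
            return target
    return None

# The AccEnt field "72740000" feeds two Define statements with different maps:
print(map_symbol("72740000", account_map))  # -> 40000
print(map_symbol("72740000", entity_map))   # -> 727
```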

See also

Define (for DataTables)

Use this function to specify how a field in the source file maps to a DataTable field.

Syntax:

DEFINE TableFieldName, MapMethod, FieldName|SymName[, MapName, INTERNAL|EXTERNAL]

where:

  • TableFieldName is the name of the DataTable field to which to map the field in the source file.
  • MapMethod is a mapping method and is one of the following:
    Value Description
    MATCH

    The values encountered in field FieldName must match the dimension symbols exactly.

    MAP

    The values encountered in field FieldName will be mapped between dimension symbols via the Map specified by MapName.

    UNIQUE

    Import operations: The SymName will be placed in the corresponding TableFieldName.

  • FieldName|SymName is one of the following:
    Value Description
    FieldName

    This parameter applies only when MapMethod is MATCH or MAP. This can be any alphanumeric identifier, and is used only within the scope of the import action. This parameter must match exactly the FieldName specified in your Set function.

    SymName

    This parameter applies only when MapMethod is UNIQUE. This is the name of a valid symbol for the specified dimension.

  • MapName is required when MapMethod is MAP and is one of the following:
    • the name of a map that exists inside the database, created using the Mappings editor. You must specify INTERNAL as the map location for this type of map.
    • the path and file name, including the relevant extension, of a map that exists as a file, such as a text file on your local machine or network location. If you do not specify the path, the system assumes the path is C:\...\MyDocuments\Longview. You must specify EXTERNAL as the map location for this type of map.

Syntax example:

Define Account, Map, Acc, MyAccountMap, INTERNAL

Define CostCentre, Match, CostCentre

Define Company, Map, CompCode, MyCompanyMap, INTERNAL

Define Currency, Match, Curr

Define PostPeriod, Unique, [[ SYSTEM, SGPCurrentPeriod, DBDEFAULT ]]

Define Amount, Match, Val

Define DebitOrCredit, Match, DRCR

Define PostDate, Match, PostDate

Define MyColumnName, Match, FieldX

It is possible to use a single named field in an Import Spec multiple times to avoid certain external transformations of the source file. To achieve this, you can use multiple define statements which refer to the same FieldName but different map files.

For example, if Field1 in the source file is similar to Account.CostCenter.CompanyCode.Currency (e.g.: 11000.007.123.CAD), you can use Define statements similar to the following:

Define Account, Map, Field1, MyAccountMap, INTERNAL

Define CostCentre, Map, Field1, MyCCMap, INTERNAL

Define Company, Map, Field1, MyCompanyMap, INTERNAL

Define Currency, Map, Field1, MyCurrMap, INTERNAL

DuplicateMappings

Use this Longview Application Framework function to specify how your process handles duplicate mappings. Duplicate mappings can occur when a dimension’s DEFINE function is set to MAP, and the specified field's value (for imports) or symbol name (for exports) for a given record matches multiple maps in the Symbol Map.

Syntax:

DUPLICATEMAPPINGS Action

where:

  • Action is the action to take when the system encounters duplicate mappings in a source file. Select one of the following:
    Value Description

    ALLOW

    Specifies that duplicate maps are permitted. In this case, if multiple EXACT maps are found, all mappings will be used so that the same value is imported to multiple data cells.

    Import operations only: If no EXACT maps are found, the value will get imported to only one location, the one corresponding to the first MAP listed.

    DISALLOW

    Specifies that duplicate maps should not be allowed. The first entry is submitted. If a duplicate map is encountered, the system reports an error.

    Note: This is the default setting.

Syntax example:

DUPLICATEMAPPINGS ALLOW
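The ALLOW/DISALLOW behavior for EXACT maps can be illustrated with a simplified Python sketch. The map structure and function name are invented; DISALLOW is modeled here as an exception, whereas Longview reports an error through its normal error handling.

```python
# Hypothetical map with two EXACT entries for the same source value "1040".
symbol_map = [("1040", "TotalSales"), ("1040", "NetSales")]

def resolve(value, symbol_map, action="DISALLOW"):
    """Simplified sketch of DUPLICATEMAPPINGS for exact maps: ALLOW uses
    every matching target, DISALLOW treats a duplicate as an error."""
    targets = [target for source, target in symbol_map if source == value]
    if len(targets) > 1 and action == "DISALLOW":
        raise ValueError("duplicate mapping for " + value)
    return targets

# ALLOW: the same value is imported to every matching target cell.
print(resolve("1040", symbol_map, "ALLOW"))  # ['TotalSales', 'NetSales']
```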

DuplicateRecords

This function does not apply to ImportSpecs and ExportSpecs for DataTables. Use this Longview Application Framework function to specify how your process handles duplicate records.

Caution: Using the ADD option adds values in the file to the existing DataArea value. Using all other options overwrites existing DataArea values. Longview recommends clearing the DataArea first before using the ADD option.

Syntax:

DUPLICATERECORDS Action

where:

  • Action is the action to take when the system encounters duplicate records in a source file. Select one of the following:
    • For use in both ImportSpecs and ExportSpecs:
      Value Description

      ADD

      Specifies that duplicate records should be allowed, with the value of all such records being summed. In this mode, if a duplicate record is encountered that contains text (versus numeric) data, it will be treated as an invalid/error record.

      DISALLOW

      Specifies that duplicate records should not be allowed. The first entry is submitted. If a duplicate record is encountered, the system reports an error.

      Note: This is the default setting.

    • For use in ImportSpecs only:
      Value Description

      USEFIRST

      Specifies that the first entry of a duplicate record should be used. If a duplicate record is encountered, the system does not report an error.

      USELAST

      Specifies that the last entry of a duplicate record should be used. If a duplicate record is encountered, the system does not report an error.

    • For use in ExportSpecs only:
      Value Description

      ALLOW

      Specifies that duplicate records are allowed, and are all exported to the target.

      Note: This option is not allowed if the Assign or Put functions are used.

Syntax example:

DUPLICATERECORDS USELAST
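The four actions can be summarized with a small Python sketch of the per-cell semantics. The function name is invented for illustration; DISALLOW is modeled as an exception, whereas Longview submits the first entry and reports an error.

```python
def resolve_duplicates(values, action="DISALLOW"):
    """Sketch of DUPLICATERECORDS semantics for one data cell.

    values: the values of all records that map to the same cell.
    """
    if len(values) <= 1:
        return values[0] if values else None
    if action == "ADD":
        return sum(values)          # duplicates are summed
    if action == "USEFIRST":
        return values[0]            # first entry wins
    if action == "USELAST":
        return values[-1]           # last entry wins
    raise ValueError("duplicate record")  # DISALLOW: treated as an error

print(resolve_duplicates([10, 5], "ADD"))      # 15
print(resolve_duplicates([10, 5], "USELAST"))  # 5
```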

See also

ErrorFile

Use this Longview Application Framework function to specify the name of the error file to contain data on invalid records.

This will create a text file in the default folder. All rejected records will be captured in this file.

Syntax:

ERRORFILE FileName

where:

  • FileName is the name of the error file to contain data on invalid records. It can include a complete or partial folder path in the format C:\...\FileName. If FileName includes spaces, enclose it in double quotation marks; for example:

    "C:\My Documents\My Data.txt"

Note: If the document is in the same location as lv_af.exe, you do not need to specify the drive or path. If the path specified does not exist, it will be created. The folder path can have a maximum length of 260 characters.

Syntax example:

ERRORFILE error.txt

IgnoreHeaderRecords

This function does not apply to:

  • ExportSpecs
  • ImportSpecs using DataSource JSON

Use this Longview Application Framework function to specify the number of header rows to ignore on import. If you do not include this function in your import specification, the system assumes your import file starts with data records and has no header rows.

Syntax

IGNOREHEADERRECORDS Num

where:

  • Num is the number of header records to ignore. Blank lines at the top of the import file are treated as records.

Syntax example:

IGNOREHEADERRECORDS 2

IgnoreFooterRecords

This function does not apply to:

  • ExportSpecs
  • ImportSpecs using DataSource JSON

Use this Longview Application Framework function to specify the number of footer rows to ignore on import. If you do not include this function in your import specification, the system assumes your import file ends with data records and has no footer rows.

Syntax:

IGNOREFOOTERRECORDS Num

where:

  • Num is the number of footer records to ignore. Blank lines at the bottom of the import file are treated as records.

Syntax example:

IGNOREFOOTERRECORDS 2
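Taken together, IGNOREHEADERRECORDS and IGNOREFOOTERRECORDS simply trim records off both ends of the file before import. This Python sketch illustrates that trimming; the function name is invented for illustration.

```python
def data_records(lines, header=0, footer=0):
    """Drop the configured number of header and footer records.

    Mirrors IGNOREHEADERRECORDS / IGNOREFOOTERRECORDS: blank lines at the
    top or bottom of the file count as records too.
    """
    end = len(lines) - footer
    return lines[header:end]

lines = ["Title", "", "A,1", "B,2", "End of file"]
print(data_records(lines, header=2, footer=1))  # ['A,1', 'B,2']
```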

IncludeHeader

This function does not apply when DataTarget uses JSON.

Use this Longview Application Framework function to export default or custom header rows when exporting from a DataArea or DataTable.

Syntax:

IncludeHeader AUTO|SPECIFY[, "line 1", "line 2", ...]

where:

  • AUTO auto-populates the header with the dimension descriptions for each dimension and VALUE for the value field. If there are multiple value fields, the symbol description will be output instead of VALUE.
  • SPECIFY allows you to specify a custom header. The header can be a single line or multiple lines. Each line must be enclosed in quotes and separated by a comma.

Syntax example:

INCLUDEHEADER AUTO

INCLUDEHEADER SPECIFY, "Account, Entity, Time Period, Value"

INCLUDEHEADER SPECIFY, "File created on May 31 2017", "****"

LogFile

Use this Longview Application Framework function to specify the name of the log file to contain information of the processing of the source or target file. This will create a text file in the default folder. All log information will be captured in this file.

Syntax:

LOGFILE FileName

where:

  • FileName is the name of the log file to contain information on the processing of the source or target file. It can include a complete or partial folder path in the format C:\...\FileName. If FileName includes spaces, enclose it in double quotation marks; for example:

    "C:\My Documents\My Data.txt"

Note: If the document is in the same location as lv_af.exe, you do not need to specify the drive or path. If the path specified does not exist, it will be created. The folder path can have a maximum length of 260 characters.

Syntax examples

LOGFILE log.txt

MaxError

This function does not apply to ImportSpecs and ExportSpecs for DataTables. For DataTables, this function is always set to 1.

Use this Longview Application Framework function to specify the maximum number of error records to permit before stopping a process. This function is optional.

Syntax:

MAXERROR NumErrorRecords

where:

  • NumErrorRecords is the maximum number of error records to permit before stopping an import. The default value for this function is 0.

Syntax examples

MAXERROR 1

Put

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function in both ImportSpecs and ExportSpecs to designate symbol and field names when each external record contains (or will get) multiple value fields, providing a way to associate each value field with specific symbols.

Syntax:

PUT SymName, FieldName

where:

  • SymName is a symbol name in this dimension.
  • FieldName is the field with which this symbol is to be associated.

The PUT function can be used only if the ASSIGN dimension has been specified, and multiple PUT functions should be specified (targeting multiple fields to multiple symbols). This function cannot be used in conjunction with the ValueField function.

See also

QuotedStrings

This function is available only for DataTables.

If you’re importing from or exporting to a .csv file, you may need to enclose string values in double quotation marks. Use this function in an ImportSpec to remove the double quotation marks from string values on import. Similarly, use this function in an ExportSpec to enclose string values in double quotation marks on export to .csv.

Syntax:

QuotedStrings ON|OFF

where:

  • ON encodes/decodes strings with double quotation marks.
  • OFF indicates that string values should be exported or imported as-is.

Syntax examples

QuotedStrings ON
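The effect of QuotedStrings ON can be demonstrated with Python's csv module, which applies the same convention: quote string fields on write, strip the quotes on read. This is only an illustration of the .csv convention, not Longview code.

```python
import csv
import io

# Writing: with quoting enabled, fields are wrapped in double quotes
# (the behavior QuotedStrings ON requests on export).
buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_ALL, lineterminator="\n").writerow(
    ["AC100", "Sales, net", 10]
)
print(buf.getvalue())  # "AC100","Sales, net","10"

# Reading: the csv reader strips the quotation marks again
# (the behavior QuotedStrings ON requests on import).
row = next(csv.reader(io.StringIO(buf.getvalue())))
print(row)  # ['AC100', 'Sales, net', '10']
```

Note that the quoting also protects the embedded comma in "Sales, net" from being read as a field delimiter.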

RegionalDates

This function is available only for DataTables.

The system stores dates for DataTable objects and App tables in ISO 8601 format (yyyy-mm-dd). Use this function in an ExportSpec to convert dates from ISO 8601 format to the format specified by the client machine’s regional settings.

Similarly, you can also use this function in an ImportSpec to convert dates from the format specified by the client machine’s regional settings to ISO 8601.

This can be useful when working with DataTable data in a .csv file that you plan to import into your Longview system.

Syntax:

RegionalDates ON|OFF

where:

  • ON converts dates from ISO 8601 to regional format and vice-versa.
  • OFF imports and exports date values without any conversion.

Syntax examples

RegionalDates ON
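The two conversion directions can be sketched with Python's datetime module. The dd/mm/yyyy format below is an assumed example of a regional setting; in practice the format comes from the client machine's regional settings.

```python
from datetime import datetime

# Assumed regional short-date format for illustration only.
REGIONAL_FORMAT = "%d/%m/%Y"

def to_regional(iso_date):
    """ISO 8601 -> regional format (the ExportSpec direction)."""
    return datetime.strptime(iso_date, "%Y-%m-%d").strftime(REGIONAL_FORMAT)

def to_iso(regional_date):
    """Regional format -> ISO 8601 (the ImportSpec direction)."""
    return datetime.strptime(regional_date, REGIONAL_FORMAT).strftime("%Y-%m-%d")

print(to_regional("2017-05-31"))  # 31/05/2017
print(to_iso("31/05/2017"))       # 2017-05-31
```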

ReverseSign

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to toggle logic dealing with sign reversal of numeric values.

When this logic is enabled (ON), the system considers the Account symbol's "balance type" property. If this property indicates a "credit" Account, the numeric value being imported or exported should have its sign reversed.

The default value for ReverseSign is OFF.

Syntax:

REVERSESIGN ON|OFF

Syntax examples

REVERSESIGN ON
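The sign-reversal rule described above amounts to a simple check against the Account's balance type. This Python sketch illustrates it; the function name and "credit"/"debit" labels are invented for illustration.

```python
def apply_reverse_sign(value, balance_type, reverse_sign=False):
    """Sketch of REVERSESIGN: flip the sign for credit accounts when ON."""
    if reverse_sign and balance_type == "credit":
        return -value
    return value

print(apply_reverse_sign(100.0, "credit", reverse_sign=True))  # -100.0
print(apply_reverse_sign(100.0, "debit", reverse_sign=True))   # 100.0
```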

ScientificNotation

Use this function in an ImportSpec if the data in your import is in scientific notation format.

Syntax:

ScientificNotation ON|OFF

where:

  • ON enables importing of data in scientific notation format.
  • OFF disables importing of data in scientific notation format.

Syntax examples

ScientificNotation ON

Search

Use this Longview Application Framework function to filter source records based on values of specified fields.

Syntax:

SEARCH SearchExpression

where:

  • SearchExpression consists of one or more relational expressions, linked by AND and OR and grouped by brackets, as appropriate:
    [(] RelationalExpression [AND|OR RelationalExpression [)] [AND|OR …]]

    Each RelationalExpression is created using field names, operators and values, as follows:

    RelationalExpression = FieldName EQ|NE|LE|LT|GE|GT "Value"

    The value parameter may optionally contain wildcards (? and *), and symbols for operators, such as ==, <=, >=, and so on.

    Multiple SEARCH expressions may appear within a single specification object.

Syntax examples

SEARCH "ACCT==??1 OR ACCT==??2"
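The example above keeps only records whose ACCT field matches one of two wildcard patterns. This Python sketch illustrates that filtering, assuming the ? and * wildcards behave like shell-style patterns; the names are invented for illustration.

```python
from fnmatch import fnmatch

def matches(record, field_name, pattern):
    """Sketch of one SEARCH relational expression with wildcards.

    record: dict of FieldName -> value for one source record.
    """
    return fnmatch(str(record[field_name]), pattern)

record = {"ACCT": "AB1", "ENT": "100"}
# Equivalent in spirit to: SEARCH "ACCT==??1 OR ACCT==??2"
keep = matches(record, "ACCT", "??1") or matches(record, "ACCT", "??2")
print(keep)  # True
```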

Set

Use this function to assign unique names identifying the structural elements of the source data file. You must use either the Define or ValueField function for each Set function.

Syntax:

SET FieldName, Position

where:

  • FieldName can be any alphanumeric identifier, and is only used within the scope of the import action.
  • Position is a parameter that can be a field number (for delimited text files) or an element[.subelement] name (for XML files). Multiple SET functions are used to refer to various fields in the data object, but not all fields require the SET function.

Syntax examples

SET Acct, 1

SET TimePer, 2

SET Ent, 3

SET Cur, 4

StringEncodedSymbolNames

This function is available only for DataTables.

If you’re importing from or exporting to a .csv file, you may need to enclose symbol names in square brackets so that numeric symbol names are not interpreted as numbers (and the leading zeros are retained). Use this function in an ExportSpec to enclose symbol names in square brackets on export to .csv. Similarly, use this function in an ImportSpec to remove the square brackets from symbol names on import.

Note: This logic applies only to symbol names in columns that have the symbol data type.

Syntax:

StringEncodedSymbolNames ON|OFF

where:

  • ON encodes/decodes symbol names with square brackets.
  • OFF indicates that symbol names should be exported or imported as-is.

Syntax examples

StringEncodedSymbolNames ON
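The bracket encoding described above can be sketched in a few lines of Python. The function names are invented for illustration; the point is that a numeric symbol name like 007 survives round-tripping through .csv tools that would otherwise read it as the number 7.

```python
def encode_symbol(name):
    """Export direction with StringEncodedSymbolNames ON: wrap the name in
    square brackets so numeric names keep their leading zeros in .csv."""
    return "[" + name + "]"

def decode_symbol(field):
    """Import direction: strip the square brackets again."""
    if field.startswith("[") and field.endswith("]"):
        return field[1:-1]
    return field

print(encode_symbol("007"))    # [007]
print(decode_symbol("[007]"))  # 007
```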

SQLBatchSize

Use this function to specify the number of records to retrieve with each fetch against the ODBC data source. If this function is not specified, the default is 1000 records. Specifying a larger number of records to retrieve may improve performance.

Note: This function applies only when DataSource is ODBC.

Syntax:

SQLBatchSize Size

where:

  • Size specifies the number of records to retrieve with each fetch against the ODBC data source.

Syntax examples

SQLBatchSize 500

SQLStatement

Use this function to specify the SQL statement to pass along to the ODBC data source.

Note: This function applies only when DataSource is ODBC.

Syntax:

SQLStatement Statement[, SQLFile]

where:

  • Statement is the SQL statement to pass along to the ODBC datasource and can be one of the following:
    Value Description

    “statement”

    Specifies the SQL statement in text, enclosed in double quotation marks.

    FILE

    Specifies that the SQL statement is specified in a file.

  • SQLFile is the file containing the SQL statement, including the extension, and applies only when Statement is FILE. The file name must be relative to the current directory.

Syntax examples

SQLStatement "SELECT Currency, Rate FROM Rates"

SQLStatement FILE, Sample.sql

ValueField

This function does not apply to ImportSpecs and ExportSpecs for DataTables.

Use this Longview Application Framework function to specify which field will contain the actual data value being imported or exported.

Syntax:

VALUEFIELD FieldName

where:

  • FieldName can be any alphanumeric identifier, and is only used within the scope of the import action.

See also
