JDBC Driver Supported Databases

Uploading Data from JDBC Driver Supported Databases using the Upload Tool

Zoho Analytics Upload Tool is a downloadable standalone utility that you can install in your local environment to upload data from local databases behind a firewall into Zoho Analytics. If your data is stored in a JDBC-supported database such as Teradata, Informix, or HP Vertica, you can use the Upload Tool to pull data from your database and upload it into Zoho Analytics.

In this section, we discuss how to configure the Upload Tool to pull data from a local/hosted JDBC-supported database and import it into Zoho Analytics. You can also read about how to configure the tool to periodically upload/synchronize the data from your databases into Zoho Analytics.

Downloading and Installing Upload Tool

Zoho Analytics Upload Tool can be downloaded from the link below.

https://www.zoho.com/analytics/help/upload-tool/download-and-setup.html#download-tool

You can install the tool by extracting the downloaded file to a location from which the tool can connect to both your local database and your Zoho Analytics Workspace over the internet. For more details on installing the Upload Tool, refer to the following link.

https://www.zoho.com/analytics/help/upload-tool/download-and-setup.html#install

Setting up JDBC Driver

You can connect to your local/hosted database using a JDBC driver by copying the database's JDBC driver (the JDBC jar file) into the /lib directory of the Upload Tool. You need to download the corresponding JDBC driver provided by the database vendor. Refer to the vendor's database documentation for details.
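For example, assuming the Upload Tool is extracted to <Upload_Tool_Home> and you have downloaded the vendor's driver jar (the jar file name below is only illustrative), the copy step on Linux/Mac might look like this:

cp terajdbc4.jar <Upload_Tool_Home>/lib/

On Windows, copy the jar file into the lib folder using File Explorer or the copy command.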

Configuring the Upload Tool

The conf directory contains all the configuration files for the tool. Before you execute the tool, you need to specify the appropriate configuration settings. The /conf directory consists of three files.

  • common_params.conf - Provides connection parameters that specify the details of the Zoho Analytics account into which the data needs to be uploaded.
  • database_connection_params.conf - Provides database connection parameters that specify the details of the local/hosted JDBC-supported database from which data needs to be uploaded.
  • database_sql_queries.xml - Allows you to specify the SQL SELECT queries that are used to fetch the data from your local/hosted JDBC-supported database and upload it into your Zoho Analytics Workspace.

Connection Parameters

The common_params.conf file provides connection parameters that specify the details of the Zoho Analytics account into which the data needs to be uploaded. Apart from the Zoho Analytics connection parameters, this file also contains import parameters and proxy server details.

Common Settings

In the common_params.conf file, specify the following parameters. These allow the Upload Tool to connect to the REPORT_SERVER_URL after it is authenticated by the IAM_SERVER_URL.

IAM_SERVER_URL (Mandatory)

Specify the URL of the Zoho Authentication server based on your data center.

  • US: https://accounts.zoho.com
  • EU: https://accounts.zoho.eu
  • IN: https://accounts.zoho.in
  • AU: https://accounts.zoho.com.au
  • CN: https://accounts.zoho.com.cn

REPORT_SERVER_URL (Mandatory)

Specify the Zoho Analytics service URL based on your data center.

  • US: https://analyticsapi.zoho.com
  • EU: https://analyticsapi.zoho.eu
  • IN: https://analyticsapi.zoho.in
  • AU: https://analyticsapi.zoho.com.au
  • CN: https://analyticsapi.zoho.com.cn

Zoho Analytics Connection Parameters

The following table contains connection parameters for Zoho Analytics.

EMAIL_ADDRESS (Mandatory)

Specify your Zoho Analytics account e-mail ID.

CLIENT_ID, CLIENT_SECRET, REFRESH_TOKEN (Mandatory)

Zoho Analytics Upload Tool supports the OAuth 2.0 protocol to authorize and authenticate the user. Refer to this link (https://www.zoho.com/analytics/api/#oauth) to generate the CLIENT_ID, CLIENT_SECRET, and REFRESH_TOKEN.

DBOWNER_EMAIL_ADDRESS (Mandatory)

Specify the Zoho Analytics account Admin e-mail ID here.

To connect the tool to the internet through a proxy server, you need to provide the following parameters. If you are using a direct internet connection, these settings can be ignored.

USEPROXY (Mandatory)

Set this to true if you connect to the internet through a proxy server. Set this to false if you are connected to the internet directly. By default, this value is set to false.

PROXYHOST (Mandatory)

Specify the machine name or IP address of the machine where the proxy server is running in your network.

PROXYPORT (Mandatory)

Specify the port on which the proxy server is running.

PROXYUSERNAME (Mandatory)

Specify the user name to access the proxy server.

PROXYPASSWORD (Mandatory)

Specify the password to access the proxy server.
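Putting these settings together, a minimal common_params.conf might look like the following sketch (the e-mail address and OAuth values are placeholders reused from the examples in this document; replace them with your own):

IAM_SERVER_URL=https://accounts.zoho.com
REPORT_SERVER_URL=https://analyticsapi.zoho.com
EMAIL_ADDRESS=eduardo@zillum.com
CLIENT_ID=1000.123456.789012
CLIENT_SECRET=12345678901234
REFRESH_TOKEN=1000.1234.1234
DBOWNER_EMAIL_ADDRESS=eduardo@zillum.com
USEPROXY=false

If USEPROXY is set to true, also set PROXYHOST, PROXYPORT, PROXYUSERNAME, and PROXYPASSWORD as described above.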

Import Parameters

These import settings help Zoho Analytics properly interpret the data being imported. You need not configure these parameters unless you want to fine-tune the import process; the default settings suffice for most cases.

These parameters are available in the common_params.conf file. To know more about the available Import Parameters and how to set them, refer here.

Database Connection Parameters

You can configure the settings to connect to your local/hosted JDBC supported database in the database_connection_params.conf file.

Specify the following connection parameters.

DBTYPE

Specify your database type in this field (e.g., teradata, informix, vertica, greenplum).

DRIVERCLASSNAME

Specify the JDBC driver class name of the database (refer to the table in the section below).

CONNECTIONURL

Specify the database URL, with or without the user name and password, to establish a connection with the local database (refer to the table in the section below).

USERNAME

Specify the user name to access the database. This is not required if you have specified the authentication details in the CONNECTIONURL itself.

PASSWORD

Specify the password to access the database. This is not required if you have specified the authentication details in the CONNECTIONURL itself.

The following screenshot illustrates the connection settings for Teradata database. 
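As a textual counterpart to the screenshot (the host, database, and credential values below are placeholders, with connection parameters following the Teradata JDBC URL format from the table below), a Teradata configuration in database_connection_params.conf might look like this:

DBTYPE=teradata
DRIVERCLASSNAME=com.teradata.jdbc.TeraDriver
CONNECTIONURL=jdbc:teradata://192.168.1.10/DATABASE=sales,DBS_PORT=1025
USERNAME=dbuser
PASSWORD=dbpassword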

JDBC Driver Supported Databases

You can connect to any local/hosted database that provides a JDBC driver using the method described here. The following table lists the default connection settings for commonly used databases.

Database | DBTYPE | Driver Class Name | Connection URL | Port (default)
Teradata | teradata | com.teradata.jdbc.TeraDriver | jdbc:teradata://<host>[/ParameterName=Value,ParameterName=Value,...] | 1025
Informix | informix | com.informix.jdbc.IfxDriver | jdbc:informix-sqli://<host>:<port>/<dbname>:informixserver=<dbservername> | 1533
HP Vertica | hpvertica | com.vertica.jdbc.Driver | jdbc:vertica://<host>:<port>/<dbname> | 5433
Ingres | ingres | com.ingres.jdbc.IngresDriver | jdbc:ingres://<host>:<port>/<dbname> | 21071
Greenplum | greenplum | org.postgresql.Driver | jdbc:postgresql://<host>:<port>/<dbname> | 5432
SQL Anywhere | sqlanywhere | com.sybase.jdbc4.jdbc.SybDriver | jdbc:sybase:Tds:<host>:<port>?ServiceName=<dbname> | 2638
Derby | derby | org.apache.derby.jdbc.ClientDriver | jdbc:derby:net://<host>:<port>/<dbname> | 1527
H2 | h2 | org.h2.Driver | jdbc:h2:tcp://<host>/<dbname> | 9092
Cache | cache | com.intersys.jdbc.CacheDriver | jdbc:Cache://<host>:<port>/<namespace> | 1972
Progress OpenEdge | progressopenedge | com.ddtek.jdbc.openedge.OpenEdgeDriver | jdbc:datadirect:openedge://<host>;databaseName=<dbname> | 9092
Cubrid | cubrid | cubrid.jdbc.driver.CUBRIDDriver | jdbc:cubrid:<host>:<port>:<dbname>::: | 33000
Mimer SQL | mimersql | com.mimer.jdbc.Driver | jdbc:mimer:[//[username[:password]@]host[:port]][/dbname] | 1360
HSQLDB | hsqldb | org.hsqldb.jdbcDriver | jdbc:hsqldb:hsql://<host>:<port>/<alias> | 9001
Mckoi | mckoi | com.mckoi.JDBCDriver | jdbc:mckoi://host:port/schema/ | 9157
FileMaker Pro | filemakerpro | com.filemaker.jdbc.Driver | jdbc:filemaker://<host>/<dbname> | 2399

Database SQL Queries

The database SQL queries file, database_sql_queries.xml, allows you to specify the SQL SELECT queries that are used to fetch data from your local/hosted JDBC-supported database and upload it into your Zoho Analytics Workspace. You can specify any number of queries to fetch and upload data as required.

The following is the query format.

<Query id="" dbname="ReportsDBName1" tablename="ReportsTableName1"  importtype="UPDATEADD" matchingcols="Date, Customer Name" selectcols="" skiptop="" batchsize="10000" queuesize="">
  select * from local_db_table
</Query>

In the above query format, sample values are provided for the attributes; replace them with your own values. The following table explains the parameters for the SQL queries.

sql_query

Specify the SQL SELECT query to be executed in the local/hosted JDBC-supported database to fetch the necessary data.

Example 1:

select * from local_db_table

This query fetches all the records from the local_db_table table.

Example 2:

select * from employee where age  &gt; 25

This query fetches all the records from the employee table where age is greater than 25.

Please note that, because the query is embedded in an XML file, the '<' symbol in the criteria should be replaced with &lt; and the '>' symbol should be replaced with &gt;.

dbname

The Zoho Analytics workspace name into which the data is to be uploaded after executing the SQL Query.

Note: Ensure that the Workspace is already available in your Zoho Analytics account. If it does not exist, create the Workspace before executing this upload to avoid failure. Refer to Creating a New Workspace to know how to create a Workspace.

tablename

The Zoho Analytics table name into which the data is to be uploaded after executing the SQL Query.

Ensure that the specified table with similar column structure is already created in the Zoho Analytics Workspace. Refer Creating a Table to know how to create a table.

You can also allow Zoho Analytics to create the table by setting the ZOHO_CREATE_TABLE parameter to true in the common_params.conf file.

importtype

Sets how the data needs to be imported.

Available options are:

  • APPEND - appends the data to the end of the table.
  • UPDATEADD - updates existing data records and appends new data records. For this, you need to configure ZOHO_MATCHING_COLUMNS in the common_params.conf file of the conf directory.
  • TRUNCATEADD - deletes the existing data and adds the new data.

matchingcols

This is applicable only when the importtype is set to UPDATEADD. Specify the column (or combination of columns) whose values uniquely identify each record in the table. If a record already exists in the table, it will be replaced with the new values in the data being uploaded. Otherwise, the record will be added at the end of the table.

selectcols

The column names, separated by commas. Only these columns are uploaded from the resultant query data into the online Workspace.

Leave this as "" (empty) if you want all the columns to be uploaded.

skiptop

The number of rows to be skipped from the top in the resultant query data before being uploaded. Leave this as "" (empty) if you want all the rows from the resultant query data to be uploaded.

batchsize

The Upload Tool splits the data into batches for uploading. The batchsize parameter controls the number of records in each batch; the default value is 10000.

This replaces the LINES_TO_SEND parameter found in the common_params.conf file. The LINES_TO_SEND parameter is deprecated in the latest version of the Upload Tool.

queuesize

The Upload Tool splits the data into batches for uploading and writes the data to temporary local files. The queuesize parameter helps regulate the file generation. The default value is nil (which means there is no limit on file generation).
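Putting these parameters together, a filled-in database_sql_queries.xml might look like the following sketch (the Workspace, table, and column names are illustrative):

<Queries>
  <Query id="1" dbname="SalesWorkspace" tablename="Orders" importtype="UPDATEADD" matchingcols="Order ID" selectcols="" skiptop="" batchsize="10000" queuesize="">
    select * from orders
  </Query>
  <Query id="2" dbname="SalesWorkspace" tablename="Employees" importtype="TRUNCATEADD" matchingcols="" selectcols="" skiptop="" batchsize="10000" queuesize="">
    select * from employee where age &gt; 25
  </Query>
</Queries>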

Executing the Upload Tool

Once you have configured the parameters, you can execute the Upload Tool by invoking the below file available in the bin folder.

  • In Windows OS - UploadFromDB.bat
  • Mac/Linux OS - UploadFromDB.sh

You can also execute the Upload Tool using the command line.

The following is an example command to invoke the Upload Tool.

UploadFromDB.bat

You can also specify the authentication details, such as the Zoho Analytics login e-mail ID, the OAuth authentication parameters, and the Workspace Administrator e-mail ID, in the command line. However, specifying these details in the common_params.conf file saves you from repeating them in your command line arguments every time.

If you specify these values both in the command line and in the common_params.conf file, the values provided in the command line take precedence.

The following is an example command to invoke the Upload Tool along with user e-mail ID and OAuth Authentication Parameters.

UploadFromDB.bat <email address> -CID <ClientId> -CSECRET <ClientSecret> -TOKEN <RefreshToken>

The following is an example command to invoke the Upload Tool along with user e-mail ID, OAuth Authentication Parameters, and Workspace Administrator e-mail ID for shared users.

UploadFromDB.bat <email address> -CID <ClientId> -CSECRET <ClientSecret> -TOKEN <RefreshToken> -D <database_owner_login_e-mail_address>

Note for Mac users:

Mac users need to define the following settings in the setEnv.sh file in the Upload Tool's /bin folder. The Upload Tool that you download is the same as the Linux version. You need to make the following changes before using the tool.
  • After you unzip the Upload Tool files, open a terminal/command line and change the directory to /bin.
  • Type the following command at the command prompt.
    whereis java
  • Copy the Java location returned by your system.
  • Open the setEnv.sh file and remove the following snippet.
    export JAVA_HOME=$TOOL_HOME/jre
  • Add the following snippet.
    export JAVABIN=<paste_your_mac_os_java_location>
    Example
    export JAVABIN=/usr/bin/java
  • Save the setEnv.sh file.

Setting up Periodic Upload / Synchronization

Using the Upload Tool, you can schedule periodic data uploads from your local/hosted JDBC-supported database to Zoho Analytics. This keeps the latest data from your application synced into Zoho Analytics periodically, so the reports created over this data stay current.

You can set up a periodic schedule using the operating system's scheduler feature as explained below.

Setting up Schedule in Windows Operating System

  1. Click Start > Settings > Control Panel > Scheduled Tasks.
  2. Click Add Scheduled Task. The Schedule Task wizard will open.
  3. Click Action > Create Task to open the Create Task dialog.
  4. Open the Action tab and then click New.
  5. Click the Browse button and select the Upload Tool command line batch file UploadFromDB.bat in the Program/Script field. Ensure that you have provided the necessary settings in the Upload Tool configuration files.
  6. In the Add Arguments field, enter the following command line arguments.
    <zoho_login_email_address> -CID <ClientId> -CSECRET <ClientSecret> -TOKEN <RefreshToken>

    Example

    eduardo@zillum.com -CID 1000.123456.789012 -CSECRET 12345678901234 -TOKEN 1000.1234.1234

  7. Specify the time to trigger the task in the Trigger tab.
  8. Click OK to save the task.

Setting up Schedule in Linux or Mac Operating System

In Linux and Mac, you can use the crontab command to schedule the upload process using the command line script UploadFromDB.sh. (Check out a simple help page on the Linux crontab command.)

The following steps explain how to set up the cron utility to schedule data uploads at a specific interval.

  1. Open the terminal and type the below command.
    crontab -e
  2. Append the parameters as a command in the following format:
    MIN HOURS * * * /UploadToolHome/bin/UploadFromDB.sh username -CID <ClientId> -CSECRET <ClientSecret> -TOKEN <RefreshToken>
    Note: The * operator is used to specify all possible values for a field.
    Example
    30 8 * * * /UploadToolHome/bin/UploadFromDB.sh eduardo@zillum.com -CID 1000.123456.789012 -CSECRET 12345678901234 -TOKEN 1000.1234.1234
  3. The above entry will execute the Upload Tool at 8:30 AM every day and upload the data from your JDBC-supported database.
  4. Save the file.

Syntax of crontab

The following are the crontab syntax used to schedule Upload Tool.

MIN HOUR DOM MON DOW CMD ARG1 ARG2 ..

  • MIN: Minutes (0 to 59)
  • HOUR: Hours (0 to 23)
  • DOM: Day of Month (1 to 31)
  • MON: Month Field (1 to 12)
  • DOW: Day of Week (0 to 6)
  • CMD: Command to execute. For scheduling data upload, specify the absolute path of the UploadFromDB.sh file
  • ARG: Arguments/parameters passed to the script. These are optional.

Refer to your operating system's cron documentation to know more about cron jobs.

Points to consider, while scheduling periodic upload using Command line mode:

  • It is the user's responsibility to ensure that the latest data is uploaded into Zoho Analytics by setting the schedule interval accordingly. Also ensure that you have provided the appropriate SQL SELECT query to pull the data.
  • In case you have deleted a few records in the tables in your local database being uploaded, the only option to remove these records from the Zoho Analytics database is to set the Import Type to TRUNCATEADD. This will delete all the records in the corresponding table in Zoho Analytics and then add the newly fetched records from the local database.
  • In case you have modified a few records in the tables in your local database being uploaded, then to get these changes reflected in the Zoho Analytics database, set the Import Type to UPDATEADD and specify the matching columns. The Upload Tool will compare the records in the corresponding Zoho Analytics table with the data being uploaded from your local database table, based on the matching columns. If a record already exists in the Zoho Analytics table, it will be replaced with the new values available from the local database. If not, it will be added as a new record in Zoho Analytics.

Viewing the Data Online

To view the data that you have uploaded:

  1. Log in to https://analytics.zoho.com.
  2. Click the corresponding Workspace name under My Workspace.
  3. Click the corresponding table on the left to open it and view the uploaded data.

FAQ and Troubleshooting Tips

Frequently Asked Questions

1. How to increase/decrease the batch size for upload?

To upload data of larger sizes, the Zoho Analytics Upload Tool splits the data and uploads it in batches. You can configure the number of rows sent in each batch by modifying the LINES_TO_SEND parameter in the common_params.conf file. By default, this is set to 5000.

Follow the steps given below to specify the batch size:

  • Open /conf/common_params.conf file.
  • Set the LINES_TO_SEND parameter to the number of lines you want to upload in each batch as given below.
LINES_TO_SEND=<number of rows to be uploaded in a batch>

Example 

LINES_TO_SEND=7000
  • Save the configuration file and start uploading the data. 

Note:

  • Each batch must not exceed 20 MB or 100,000 records.

2. Can I allow the shared user to upload the data into my table?

Yes, you can allow your shared users to upload data into your tables using Upload Tool. Follow the steps given below to do this.

Steps to be followed by the Account Admin or Workspace Administrator:

  • Log in to the Workspace Administrator's account.
  • Share the table into which you want to allow your shared user to upload data, with Import permission.
  • For more details, refer to this FAQ section.

Steps to be followed by the shared user:

  • Open /conf/common_params.conf file.
  • Add the DBOWNERNAME parameter and set it to the Workspace Administrator's Zoho Analytics e-mail ID or user name.

Example:

 DBOWNERNAME=patriciab
  • Save the configuration file.
  • Start uploading the data using the shared user's Zoho Analytics credentials.

3. Can Upload tool be used to upload data from remotely hosted databases into Zoho Analytics?

Yes, you can upload data from remotely hosted databases (running on a remote machine) into Zoho Analytics using the Upload Tool, provided a network connection via the Java Database Connectivity (JDBC) driver can be established between the remote database server and the machine on which you have installed the Upload Tool.

The connection settings for a hosted database are configured in the same way as for a local database. In the CONNECTIONURL parameter, specify the remote database server's IP address or host name and the corresponding port number. To know how to specify the connection settings, refer to the Database Connection Parameters section above.
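For example (the host address, port, and database name are placeholders), a connection to a remotely hosted HP Vertica server could be configured in database_connection_params.conf as:

DBTYPE=hpvertica
DRIVERCLASSNAME=com.vertica.jdbc.Driver
CONNECTIONURL=jdbc:vertica://192.168.1.50:5433/salesdb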

4. Is it possible to fetch data from multiple databases using Upload Tool?

Yes, it is possible to set up the Zoho Analytics Upload Tool to fetch data from multiple databases. By default, the Upload Tool is designed to upload data from one database per upload process. However, you can modify the configuration files and the executable bat or sh files to upload data from more than one database.

To accomplish this, you need to create a separate set of configuration files for each database.

For example, let's say you have two different databases to be connected for data upload. You should have one set of configuration files for Database One and another set for Database Two.

If you are working in a Windows environment, you can download the modified sample configuration and batch (UploadFromDB.bat) files from this download link. Similar changes are required for the UploadFromDB.sh shell script if you need to set up this configuration in a Linux/Mac environment.

Follow the below steps once you have downloaded the sample and extracted the same.

  • Conf files to be copied to the Upload Tool's conf folder
    (i.e., "<UPLOAD_TOOL_HOME>/ZohoReports/UploadTool/conf"):
    • database_connections_params1.conf
    • database_sql_queries1.xml
    • database_connections_params2.conf
    • database_sql_queries2.xml
  • Executable UploadFromDB.bat file to be copied to the Upload Tool's bin folder
    (i.e., "<UPLOAD_TOOL_HOME>/ZohoReports/UploadTool/bin"). The .bat file has been modified to upload from the two different databases with the two different configurations given.

The following are the steps to configure two databases in the Upload tool:

Configuration for Database One:

  • Provide the database connection properties in database_connections_params1.conf.
  • Define the necessary SQL queries in database_sql_queries1.xml.

Configuration for Database Two:

  • Provide the database connection properties in database_connections_params2.conf.
  • Define the necessary SQL queries in database_sql_queries2.xml.

Once the above setup is done, execute the modified batch file copied to the bin directory. This will fetch data from both databases and upload it into Zoho Analytics.

5. Is it mandatory to follow similar column names in Zoho Analytics table as available in my local database?

No. You can use column names in the Zoho Analytics table that differ from those in your local database and still upload data using the Upload Tool. Follow the steps below to upload the data from your local database into the corresponding columns in the Zoho Analytics table.

  • Open <Tool_Home>/conf/database_sql_queries.xml.
  • Specify the query as given below to upload data from local database into the corresponding column in Zoho Analytics table. 
<Queries>
  <Query dbname="TestDB1" tablename="TestTable1" importtype="APPEND" matchingcols="" selectcols="" skiptop="">
    SELECT column1-local-db AS column1-zohoreports-table,
           column2-local-db AS column2-zohoreports-table,
           column3-local-db AS column3-zohoreports-table
    FROM databasetable1
  </Query>
</Queries>

The above query pulls the data from column1-local-db, column2-local-db, and column3-local-db in your local database table databasetable1 and imports it into column1-zohoreports-table, column2-zohoreports-table, and column3-zohoreports-table in the Zoho Analytics table.

  • Save the file and start the upload process.

6. How to upload data with different date format from the in-house/local databases into Zoho Analytics?

Zoho Analytics Upload Tool expects the date format of the data being uploaded to be dd/MM/yyyy HH:mm:ss. In case you are using a custom date format in your local database, you need to convert your date values to the required format in your SQL SELECT query, for example using the DATE_FORMAT SQL function (available in MySQL; other databases provide equivalent date-formatting functions).

To convert the date format, wrap the date column in the DATE_FORMAT function and specify the required date format as given below.

DATE_FORMAT(<date_field>, '%d/%m/%Y 00:00:00')

In case your date column is stored as a string field, you need to convert the data type using the DATE function as given below.

DATE_FORMAT(DATE(<date_field>), '%d/%m/%Y 00:00:00')
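For instance (the table and column names are illustrative, and the syntax assumes a MySQL-compatible database), the complete SELECT query inside database_sql_queries.xml might look like:

select id, customer_name, DATE_FORMAT(order_date, '%d/%m/%Y 00:00:00') AS order_date from orders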

Troubleshooting Tips

Problem 1: I get "The host did not accept the connection within timeout of 15000 ms" while trying to upload data using Upload Tool. How to solve it?

Solution: This could be because of improper proxy server settings. If you are connecting to the internet through a proxy server, ensure that you have configured the correct proxy server details in the common_params.conf file. For more details, refer to the Common Settings and proxy configuration sections above.

Problem 2: I get an error message as ‘out of memory’ while trying to upload the file. How can I solve this error?

Solution: This could be because the default memory allocated to the Java Virtual Machine (which executes the Upload Tool) on your machine is not sufficient to import a large file, and hence the tool throws the "Out of Memory" error. Follow the steps below to resolve this issue:

  • Open the file setEnv.bat / setEnv.sh available in the directory <Upload_Tool_Home>/bin
  • Add the property -Xmx1024M at the end of the JAVA_OPTS variable as below:
    Windows: set JAVA_OPTS=%JAVA_OPTS% -XX:NewSize=48M -Xmx1024M
    Linux/Mac: export JAVA_OPTS=$JAVA_OPTS -XX:NewSize=48M -Xmx1024M
  • Save the file and start the upload process.

Problem 3: I get "Error!!! Sorry, you cannot upload files that exceed 50MB in size" while uploading data using Upload Tool. How to overcome this?

Solution: To upload data of larger sizes, the Zoho Analytics Upload Tool splits the data and uploads it in batches. It is important that each batch does not exceed 20 MB or 100,000 records. You can increase or decrease the number of lines/records sent per batch in the common_params.conf file.

Follow the steps given below to set the lines to be uploaded in a batch:

  • Open <Tool_Home>/conf/common_params.conf file
  • Change the parameter LINES_TO_SEND=<no_of_lines>
  • Save the configuration file and start uploading the data

Problem 4: I get an error message as "Another import is in progress in this table started by the user 'User name' at 'time of import'." How to overcome this?

Solution: You will get the above message when more than one import process is running on the same table at the same time. Ensure that no other user is importing into the same table before initiating the process.

Problem 5: I get a message as "ZOHO_MATCHING_COLUMNS is not present in the request parameters list". How to solve it?

Solution: You will get the above error message when you have set the import type to UPDATEADD but have not specified any value for the ZOHO_MATCHING_COLUMNS parameter. This parameter is mandatory when ZOHO_IMPORT_TYPE is set to UPDATEADD. Set this parameter to the column names based on which the existing records in the table should be matched. If a record already exists in the table, it will be replaced with the new values in the data being uploaded. The remaining rows will be added at the end of the table.

Note:

  • It is recommended to set columns with unique values as matching column.
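For example (the column names are illustrative, and the value is assumed to follow the same comma-separated form as the matchingcols attribute described earlier), the parameter in common_params.conf might be set as:

ZOHO_MATCHING_COLUMNS=Date,Customer Name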

Problem 6: I get "Error!!! Column "Column_Name" is present in match columns but not in selected columns" while uploading data. How to overcome this?

Solution: You will get the above error when the column specified for the ZOHO_MATCHING_COLUMNS parameter is not available in the table. When you have set the Import Type to UPDATEADD, it is mandatory to specify the matching columns. These columns are used to check whether a record already exists in the table. Ensure that you have specified a valid column name as the matching column before uploading data to avoid failure.

Problem 7: I get an error message as "Maximum Concurrent User Tickets Limit Exceeded". How to solve this?

Solution: When you access the service, a session is created, which is deactivated once you log out or after 7 days. If you do not log out, or if you access the service from multiple locations, multiple active sessions will be created for your account. The Zoho service restricts a user to a maximum of 20 active sessions. In case you have exceeded this limit, you will not be allowed to access the service. To overcome this, you need to close the active sessions.

Follow the steps below to close the current active sessions.

  • Log in to https://accounts.zoho.com with your Zoho account credentials.
  • Choose Active Sessions under the Home tab.
  • Remove the active sessions by clicking Close All Other Sessions.

Problem 8: I get an error message as "You need to (re)login to perform this operation." How do I overcome this?

Solution: You will get the above message when Zoho Analytics encounters an authentication problem. This could happen in the following scenarios:

Ensure that your authentication tokens are active. In case your Zoho Analytics account is hosted in our EU data center, specify the authentication parameters as below.

  • IAM_SERVER_URL= https://accounts.zoho.eu
  • REPORT_SERVER_URL= https://analyticsapi.zoho.eu

Problem 9: While uploading data from MySQL database, I got the error "Value '0000-00-00' cannot be represented as java.sql.Date". How can I overcome this error?

This error can occur when the table in your local MySQL database from which you are fetching data contains a date column with NULL values. When queried, MySQL returns such values as '0000-00-00' by default. Since this is an invalid value, the MySQL JDBC driver throws this error.

You can overcome this error using the database_connection_params.conf file available in the Upload Tool's conf directory. While configuring the local connection settings, in the DBNAME parameter specify your database name followed by the ?zeroDateTimeBehavior=convertToNull& property. This converts the invalid value and adds a null value in the Zoho Analytics table.

Example

DBNAME=Sales-Database?zeroDateTimeBehavior=convertToNull&


Still can't find what you're looking for?

Write to us: support@zohoanalytics.com