KNOWLEDGE BASE

Error "Error from Hive: error code: '0' ... 'ExecuteStatement finished with operation state" When Connecting to Hadoop Table


Published: 04 Nov 2014
Last Modified Date: 09 Jul 2018

Issue

When connecting to a Hadoop Hive table, data does not load and the following error message might occur:
 
[Hortonworks][Hardy] (35) Error from server: error code: '0' error message: 'ExecuteStatement finished with operation state: ERROR_STATE'.
 
Additionally, this error might occur when:
  • You try to add fields to a view that has a live connection to a Hadoop Hive data source.
  • Two tables in a Hadoop Hive database are joined and you attempt to update the data source in Tableau Desktop.

Environment

  • Tableau Desktop
  • Hortonworks Hadoop Hive

Resolution

Option 1

Work with your Hadoop administrator to ensure that you have permissions to access the table.

Note: Simply granting permissions to a username may not be sufficient; a directory path may also need to be specified.
For more information, see Directories and Permissions in Hadoop's knowledge base.
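As a rough sketch of what the administrator might check, the following commands inspect the table's warehouse directory and grant SELECT to the connecting user. The paths, host, database, table, and user names are hypothetical examples, and the GRANT statement assumes SQL-standard based authorization is enabled on the cluster:

```shell
# Inspect HDFS permissions on the table's warehouse directory
# (path and table name are hypothetical)
hdfs dfs -ls /apps/hive/warehouse/sales_db.db/orders

# Grant SELECT on the table to the connecting user via beeline
# (requires SQL-standard based authorization to be enabled)
beeline -u "jdbc:hive2://hiveserver:10000" \
  -e "GRANT SELECT ON TABLE sales_db.orders TO USER tableau_user;"
```

If the cluster uses Ranger or Sentry for authorization instead, the grant is managed through that tool rather than HiveQL.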

Option 2

Ensure that the latest drivers are installed and configured correctly: Drivers & Activation.
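For reference, a minimal Linux DSN for the Hortonworks Hive ODBC driver might look like the fragment below. The driver path, host, and user are placeholders, and exact key names and the library location vary by driver version, so verify them against the driver's own installation guide:

```
[ODBC Data Sources]
HortonworksHive=Hortonworks Hive ODBC Driver

[HortonworksHive]
Driver=/usr/lib/hive/lib/native/Linux-amd64-64/libhortonworkshiveodbc64.so
Host=your-hiveserver2-host
Port=10000
HiveServerType=2
AuthMech=3
UID=tableau_user
```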

Option 3

Work with Hortonworks support to ensure that the Hadoop user's home directory has been created correctly. Please note that the following error message might appear in the Hortonworks Hadoop log files:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=<>, access=WRITE, inode="/user":<>
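The AccessControlException above typically means the connecting user has no writable home directory under /user. A common fix, sketched below with a hypothetical user name, is to create the directory as the HDFS superuser and hand ownership to the user:

```shell
# Create the user's HDFS home directory and set ownership
# (run as the HDFS superuser; "tableau_user" is a hypothetical name)
sudo -u hdfs hdfs dfs -mkdir -p /user/tableau_user
sudo -u hdfs hdfs dfs -chown tableau_user:tableau_user /user/tableau_user
sudo -u hdfs hdfs dfs -chmod 755 /user/tableau_user
```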

Option 4

Ensure that the ODBC driver is configured with the correct user and that the user has the required permissions. If the issue persists, contact your Hadoop Hive distribution vendor for assistance.

Cause

The ODBC driver is configured with an incorrect user, the user lacks the required permissions, or the driver itself is misconfigured.

Alternatively, the Hadoop Distributed File System (HDFS) user's home directory may have been created in a way that Hadoop cannot recognize. When MapReduce is triggered, a required .jar file cannot be written to that directory.

Additional Information

More information and troubleshooting steps can be found in the following Tableau Community Forum and third-party links:

Tableau Community Forum

Cloudera: ODBC Driver - Error when joining tables - The issue was resolved by setting hive.auto.convert.join to false in the ODBC properties. This was done because HiveServer2 failed to execute a MapReduce local task, and the failure did not appear to be limited to queries submitted through ODBC.
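As a sketch of that workaround: Simba-based Hive ODBC drivers (including the Hortonworks driver) generally let you pass server-side Hive properties from the DSN by prefixing the key with SSP_. The DSN name below is a placeholder, and the SSP_ convention should be verified against your driver version's documentation:

```
[HortonworksHive]
SSP_hive.auto.convert.join=false
```

The same property can instead be set cluster-wide in hive-site.xml, but the DSN-level setting limits the change to connections made through that data source.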

Other

Important: Although we make every effort to ensure these links to external websites are accurate and relevant, Tableau cannot take responsibility for the accuracy or freshness of pages maintained by external providers. Contact the external site for answers to questions regarding its content.