org.apache.hadoop.fs.FileAlreadyExistsException: Failed to rename temp file

Hiran Amarathunga 65 Reputation points
2024-06-07T01:40:56.47+00:00

[Repeat Question due to old thread]

We have built a streaming pipeline with Spark Auto Loader.

The source folder is an Azure Blob Storage container.

We've encountered a rare issue (we could not replicate it). Below is the exception message:

org.apache.hadoop.fs.FileAlreadyExistsException: Failed to rename temp file /checkpoint/sources/0/rocksdb/__tmp_path_dir/.220.zip.tmp to /checkpoint/sources/0/rocksdb/220.zip because file exists.
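
For context, the pipeline is roughly of this shape (a minimal sketch; the source path, file format, and target table below are illustrative, not our exact job):

```python
# Minimal sketch of an Auto Loader stream; source path, format, and
# target table are illustrative, not our exact job.
source_path = "wasbs://<container>@<storage-account>.blob.core.windows.net/input/"
checkpoint_path = "/checkpoint"

df = (
    spark.readStream
    .format("cloudFiles")                                    # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint_path + "/schema")
    .load(source_path)
)

query = (
    df.writeStream
    .option("checkpointLocation", checkpoint_path)           # RocksDB state lives under this path
    .trigger(processingTime="1 minute")
    .toTable("bronze_events")
)
```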

Please help with this if you're familiar with it, since it looks like it may be a known platform issue.

Thank you

Tags: Azure Data Lake Storage, Azure Databricks

Accepted answer
  1. PRADEEPCHEEKATLA-MSFT 84,376 Reputation points Microsoft Employee
    2024-06-10T06:13:46.5933333+00:00

@Hiran Amarathunga - Thanks for the question and using MS Q&A platform.

    The error message you are seeing indicates that the file /checkpoint/sources/0/rocksdb/220.zip already exists and cannot be overwritten. This can happen if the file was not properly closed or deleted before the new file was written.

    To resolve this issue, you can try deleting the existing file /checkpoint/sources/0/rocksdb/220.zip and then rerunning the pipeline. If you are unable to delete the file, you can try renaming it to a different name and then rerunning the pipeline.
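
    For example, from a Databricks notebook you could remove the conflicting zip and any leftover temp file with dbutils before restarting the stream. This is only a sketch: the paths are taken from your error message, so adjust them to your actual checkpoint location, and make sure the streaming query is stopped first.

    ```python
    # Paths are taken from the error message; adjust to your checkpoint location.
    conflicting_file = "/checkpoint/sources/0/rocksdb/220.zip"
    leftover_tmp = "/checkpoint/sources/0/rocksdb/__tmp_path_dir/.220.zip.tmp"

    # Stop the streaming query first, then remove the stale files and rerun the pipeline.
    for path in (conflicting_file, leftover_tmp):
        removed = dbutils.fs.rm(path)   # returns True if the path was deleted
        print(f"{path} removed: {removed}")
    ```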

    If the issue persists, you can try checking the permissions on the file and ensure that the user running the pipeline has the necessary permissions to write to the file.

    Additionally, you can try checking if there are any other processes or jobs that may be accessing the file and preventing it from being overwritten.
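
    A quick way to check this from a notebook is to list the streaming queries currently active on the cluster and inspect the RocksDB state directory under the checkpoint (a sketch; the path is taken from the error message). Each checkpoint location should be used by exactly one streaming query.

    ```python
    # Streaming queries currently running on this cluster; each checkpoint
    # location should be used by exactly one of them.
    for q in spark.streams.active:
        print(q.id, q.runId, q.name)

    # Contents of the RocksDB state directory under the checkpoint,
    # including any leftover files in __tmp_path_dir.
    for f in dbutils.fs.ls("/checkpoint/sources/0/rocksdb/"):
        print(f.path, f.size)
    ```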

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, do click Accept Answer and Yes for "Was this answer helpful". And if you have any further queries, do let us know.

