File length exception while uploading file to Azure Data Lake Gen2

Bhuvnesh Kumar 26 Reputation points
2020-10-17T10:48:27.717+00:00

I am using the Java SDK to copy a file from S3 to Azure Data Lake Gen2.

    // Open the S3 object and get its content stream
    S3Object fullObject = s3Client.getObject("bucket", "folder/file.txt");
    S3ObjectInputStream strm = fullObject.getObjectContent();

    // Create the target file in the Gen2 file system
    DataLakeFileSystemClient gen2Client = gen2ServiceClient.getFileSystemClient("fileSystemName");
    DataLakeFileClient fileClient = gen2Client.getFileClient("folder/file.txt");
    fileClient.create(true);

    // Append the S3 stream, using the Content-Length reported by S3 as the length
    InputStream bufferedIn = new BufferedInputStream(strm);
    Response<Void> res = fileClient.appendWithResponse(bufferedIn, 0, fullObject.getObjectMetadata().getContentLength(),
            null, null, null, Context.NONE);
    if (res.getStatusCode() != 200) {
        System.out.println("Failed for file: " + res);
    }
    PathInfo pathInfo = fileClient.flush(fullObject.getObjectMetadata().getContentLength(), true);

But this call throws an error:

com.azure.core.exception.UnexpectedLengthException: Request body emitted 966650 bytes, more than the expected 966649 bytes

Even when I add 1 to the length I am passing (fullObject.getObjectMetadata().getContentLength() + 1), it throws:

com.azure.core.exception.UnexpectedLengthException: Request body emitted 966649 bytes, less than the expected 966650 bytes.
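One workaround I am considering is to buffer the whole object first so I can pass the exact number of bytes read instead of the S3 Content-Length metadata. This is only a minimal sketch, assuming the mismatch comes from S3's reported Content-Length differing from the bytes the stream actually delivers (for example if the client transparently decodes the content):

    // Sketch (assumption): read the S3 stream fully into memory so the exact
    // byte count is known, then append/flush with that count instead of the
    // Content-Length value from the S3 object metadata.
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    byte[] chunk = new byte[8192];
    int read;
    try (InputStream in = fullObject.getObjectContent()) {
        while ((read = in.read(chunk)) != -1) {
            buffer.write(chunk, 0, read);
        }
    }
    byte[] data = buffer.toByteArray();

    fileClient.create(true);
    // Pass the length that was actually read, not the metadata value
    fileClient.append(new ByteArrayInputStream(data), 0, data.length);
    fileClient.flush(data.length, true);

This loads the file into memory, so it is only practical for small objects, but it would confirm whether the Content-Length metadata is the source of the discrepancy.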
