File length exception while uploading file to Gen2 lake
Bhuvnesh Kumar
I am using the Java SDK to copy a file from S3 to Azure Data Lake Gen2.
// Open the source object on S3
S3Object fullObject = s3Client.getObject("bucket", "folder/file.txt");
S3ObjectInputStream strm = fullObject.getObjectContent();

// Create (overwrite) the destination file on the Gen2 file system
DataLakeFileSystemClient gen2Client = gen2ServiceClient.getFileSystemClient("fileSystemName");
DataLakeFileClient fileClient = gen2Client.getFileClient("folder/file.txt");
fileClient.create(true);

// Append the stream, declaring the length reported by the S3 object metadata
InputStream bufferedIn = new BufferedInputStream(strm);
long length = fullObject.getObjectMetadata().getContentLength();
Response<Void> res = fileClient.appendWithResponse(bufferedIn, 0, length,
        null, null, null, Context.NONE);
if (res.getStatusCode() != 202) { // a successful append returns 202 Accepted
    System.out.println("Failed for file: " + res);
}
PathInfo pathInfo = fileClient.flush(length, true);
But this call throws an error:
com.azure.core.exception.UnexpectedLengthException: Request body emitted 966650 bytes, more than the expected 966649 bytes
Even when I add 1 to the length I am passing (fullObject.getObjectMetadata().getContentLength() + 1), it throws:
com.azure.core.exception.UnexpectedLengthException: Request body emitted 966649 bytes, less than the expected 966650 bytes.
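The Azure SDK validates that the number of bytes actually read from the stream matches the `length` argument exactly, so any disagreement with S3's `ContentLength` metadata (for example when the object was stored with a content encoding) raises `UnexpectedLengthException`. A minimal way to sidestep the metadata entirely is to drain the source stream into memory first and pass the resulting array's length to both `appendWithResponse` and `flush`. The sketch below uses only the JDK; the class and method names (`ExactLength`, `drain`) are my own, not part of either SDK:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ExactLength {
    // Read the stream to exhaustion so the byte count we later declare
    // to the Data Lake append/flush calls is exactly what we send.
    public static byte[] drain(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

With the bytes in hand, the upload would become `appendWithResponse(new ByteArrayInputStream(data), 0, data.length, ...)` followed by `flush(data.length, true)`. For very large objects this holds the whole file in memory, so a chunked append loop (tracking the running offset yourself) would be the safer variant.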