out of memory error #43

Open
matryer opened this issue Sep 2, 2015 · 2 comments

Comments


matryer commented Sep 2, 2015

I get a runtime out-of-memory panic when I try to extract EXIF metadata from a large, empty file.

Given a file generated with this:

dd if=/dev/zero of=output.dat bs=5000000 count=1000

Obviously I don't expect this to work, but I don't expect it to consume all system resources either.

  • File is 5 GB
  • Running inside Ubuntu VM.
  • File generated on a Mac running 10.10.5 (14F27)
  • What other files might this also occur on? Is it the size of the file, or the fact that it's made of nothing?
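
A minimal reproduction sketch, assuming the output.dat generated by the dd command above and goexif's exif.Decode entry point:

```go
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/rwcarlsen/goexif/exif"
)

func main() {
	// output.dat is the 5 GB zero-filled file produced by the dd command above.
	f, err := os.Open("output.dat")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Decoding is expected to fail on this file; the problem reported here is
	// that the decoder buffers huge amounts of the input while scanning for
	// EXIF data, exhausting memory in a small VM before it can fail cleanly.
	x, err := exif.Decode(f)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(x)
}
```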

matryer commented Sep 2, 2015

I guess it's the file size? https://github.com/rwcarlsen/goexif/blob/go1/tiff/tiff.go#L34

Perhaps it should take an io.ReaderAt to make it clear that the content needs buffering (or at least needs to be seekable) rather than being read as a stream?
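
A caller-side sketch of that idea: *os.File already implements io.ReaderAt, so io.NewSectionReader can bound how much the decoder is able to read before handing it to the existing exif.Decode. The 1 MiB cap and the file name are assumptions, and this only helps when the EXIF segment sits near the start of the file (typical for JPEGs, not guaranteed in general):

```go
package main

import (
	"io"
	"log"
	"os"

	"github.com/rwcarlsen/goexif/exif"
)

func main() {
	f, err := os.Open("photo.jpg") // hypothetical input file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// *os.File satisfies io.ReaderAt, so a SectionReader gives the decoder a
	// bounded view: even if it tries to slurp everything, it can read at most
	// 1 MiB (an arbitrary cap for this sketch).
	bounded := io.NewSectionReader(f, 0, 1<<20)

	x, err := exif.Decode(bounded)
	if err != nil {
		log.Fatal(err)
	}

	if tag, err := x.Get(exif.DateTime); err == nil {
		log.Println("DateTime:", tag)
	}
}
```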


mholt commented Jan 6, 2017

I just got bitten by this (or something similar) too. I was hoping to use this package the way the hashes in crypto work (h := sha256.New(), write to h a lot, then call h.Sum(nil) to get the final result), but it blocks, I think, on this call to ioutil.ReadAll(). So I put it in a goroutine, but apparently this library only reads as much data as it needs, then stops reading. That's a problem when using an io.MultiWriter + io.Pipe: I still need to stream the whole file to disk, but this library doesn't read more than the first few (kilo)bytes, causing indefinite blocking.

(Edit: I fixed this by wrapping the write end of the pipe in a "DishonestWriter" that reports a successful write even if the read end is closed, and then closing the read end of the pipe when EXIF decoding is done.)
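
A sketch of that workaround, assuming the wrapper swallows io.ErrClosedPipe; the type name and the wiring below are guesses, since the original implementation isn't shown in this thread:

```go
package main

import (
	"io"
	"log"
	"os"

	"github.com/rwcarlsen/goexif/exif"
)

// dishonestWriter wraps the write end of an io.Pipe and reports a full,
// successful write even after the read end has been closed, so the
// io.MultiWriter keeps streaming the rest of the file to disk.
type dishonestWriter struct {
	w io.Writer
}

func (d dishonestWriter) Write(p []byte) (int, error) {
	n, err := d.w.Write(p)
	if err == io.ErrClosedPipe {
		// The EXIF side has stopped reading; pretend the bytes were written.
		return len(p), nil
	}
	return n, err
}

func main() {
	src, err := os.Open("upload.jpg") // hypothetical source stream
	if err != nil {
		log.Fatal(err)
	}
	defer src.Close()

	dst, err := os.Create("copy.jpg") // the on-disk copy we still need
	if err != nil {
		log.Fatal(err)
	}
	defer dst.Close()

	pr, pw := io.Pipe()

	// Decode EXIF from one branch of the stream, then close the read end so
	// later writes into the pipe fail with io.ErrClosedPipe, which the
	// wrapper above swallows instead of aborting the copy.
	done := make(chan struct{})
	go func() {
		defer close(done)
		defer pr.Close()
		x, err := exif.Decode(pr)
		if err != nil {
			log.Println("exif decode failed:", err)
			return
		}
		log.Println("exif decoded:", x)
	}()

	// Stream the whole file to disk and to the EXIF decoder at the same time.
	if _, err := io.Copy(io.MultiWriter(dst, dishonestWriter{pw}), src); err != nil {
		log.Fatal(err)
	}
	pw.Close()
	<-done
}
```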
