Transform code snippet for Computer Vision problem set in the book not working (or I couldn't make it work) #25
Comments
Well, as an update: I was able to move forward by adding the following line as the first line of the `process_image` function: `raw_image = tf.squeeze(raw_image)`. I skipped the label processing for now to see how far I get, but I end up with yet another error:
I believe that is because:
@festeh, I tried that, but it gives the following error:
What is the starting point for your transformation? A byte string? Do you load the byte string in your ExampleGen component?
You can call the process_image function as follows:
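Roughly, the call inside `preprocessing_fn` looks like the sketch below (not the repo's exact code; the feature keys `image_raw`/`label`/`image_xf` and the 300×300 output size are placeholders, and `fn_output_signature` needs TF 2.3+; on older versions pass `dtype=tf.float32` instead):

```python
import tensorflow as tf

def preprocessing_fn(inputs):
    # inputs["image_raw"] is a batch of serialized JPEG byte strings.
    # tf.map_fn applies process_image (discussed in this issue) to every
    # element of the batch and declares the per-element output shape/dtype.
    images = tf.map_fn(
        process_image,
        inputs["image_raw"],
        fn_output_signature=tf.TensorSpec([300, 300, 3], tf.float32),
    )
    # Pass the string label through unchanged for now.
    return {"image_xf": images, "label_xf": inputs["label"]}
```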
Please close the issue if this solves your problem. Thank you!
Hi all, I found the issue. I've created a PR that should provide you with a working end-to-end example:
@shabie, were you able to run this part of the Beam pipeline, as defined under Chapter 5's "Standalone Execution of TFT", for the computer vision problem?
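For context, by that I mean roughly the pattern below (a sketch, not the book's exact listing; the feature keys, the example.jpg path, and the label value are placeholders, and `preprocessing_fn` is the one discussed above):

```python
import tempfile

import tensorflow as tf
import tensorflow_transform.beam as tft_beam
from tensorflow_transform.tf_metadata import dataset_metadata, schema_utils

# Two raw features, both stored as byte strings, as described in the issue.
raw_metadata = dataset_metadata.DatasetMetadata(
    schema_utils.schema_from_feature_spec({
        "image_raw": tf.io.FixedLenFeature([], tf.string),
        "label": tf.io.FixedLenFeature([], tf.string),
    }))

# A tiny in-memory dataset stands in for the real TFRecord files here.
raw_data = [
    {"image_raw": open("example.jpg", "rb").read(), "label": b"dog"},
]

# preprocessing_fn is the function discussed earlier in this thread.
with tft_beam.Context(temp_dir=tempfile.mkdtemp()):
    (transformed_data, transformed_metadata), transform_fn = (
        (raw_data, raw_metadata)
        | tft_beam.AnalyzeAndTransformDataset(preprocessing_fn))
```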
I keep running into the following error:
The book provides code snippets for the computer vision problem set, but I couldn't get the transform to work. Specifically, I mean the following code:
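The snippet decodes the raw JPEG bytes and resizes the image; roughly, it is along these lines (a sketch rather than the book's exact listing; the 300×300 target size is a stand-in):

```python
import tensorflow as tf

def process_image(raw_image):
    # Decode the serialized JPEG bytes into an RGB image tensor.
    image = tf.io.decode_jpeg(raw_image, channels=3)
    # Convert to floats in [0, 1] and resize to a fixed size so that
    # every example ends up with the same shape.
    image = tf.image.convert_image_dtype(image, tf.float32)
    image = tf.image.resize(image, [300, 300])
    return image
```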
I am using it as follows in the `preprocessing_fn`:

This is being called in the Transform step of the pipeline (wiring sketched below):

The TFRecordDataset has two features: one contains the raw (JPEG) image and the other contains the label as a string (also stored as bytes). It was generated with pretty much the same code shown earlier in the book, in the Data Ingestion chapter.
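The Transform step itself is wired up in the standard way, roughly like this (a sketch, not my exact code; `example_gen`, `schema_gen`, and the interactive `context` are assumed to exist from the earlier pipeline steps, and `module.py` stands in for the file that defines `preprocessing_fn` and `process_image`):

```python
from tfx.components import Transform

# example_gen, schema_gen and context (an InteractiveContext) come from the
# earlier pipeline steps; module.py is a placeholder for the module file
# that defines preprocessing_fn and process_image.
transform = Transform(
    examples=example_gen.outputs["examples"],
    schema=schema_gen.outputs["schema"],
    module_file="module.py",
)
context.run(transform)
```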
When I run the above, I get the following traceback: