
ValueError: Tried to convert 'shape' to a tensor and failed. Error: None values not supported. #18753

Closed
Aksh-kumar opened this issue Nov 9, 2023 · 1 comment

@Aksh-kumar

Issue type
Bug

Have you reproduced the bug with TensorFlow Nightly?
Yes

Source
source

TensorFlow version
2.13.0

Custom code
Yes

OS platform and distribution
Windows 10

Mobile device
No response

Python version
3.9.12

Bazel version
No response

GCC/compiler version
No response

CUDA/cuDNN version
No response

GPU model and memory
No response

Current behavior?

I am trying to implement this in TensorFlow 2.13, but I get the error "ValueError: Tried to convert 'shape' to a tensor and failed. Error: None values not supported". After a lot of searching I learned that -1 should be used in place of None, but even after trying that it still fails. The error comes from this line in my custom Patches layer:

patches = tf.reshape(patches, [batch_size, -1, patch_dims])

Can anybody please help?

import tensorflow as tf
from tensorflow.keras import layers


class Patches(layers.Layer):

    def __init__(self, patch_size, **kwargs):
        super().__init__(**kwargs)
        self.patch_size = patch_size

    def call(self, images):
        batch_size = tf.shape(images)[0]  # get the batch size
        patches = tf.image.extract_patches(
            images=images,
            sizes=[1, self.patch_size, self.patch_size, 1],    # patches only along the height and width dimensions
            strides=[1, self.patch_size, self.patch_size, 1],  # the next patch should not overlap the previous patch
            rates=[1, 1, 1, 1],
            padding='VALID'
        )
        patch_dims = patches.shape[-1]
        patches = tf.reshape(patches, [batch_size, -1, patch_dims])

        return patches

    def get_config(self):
        config = super().get_config()
        config.update({
            "patch_size": self.patch_size,
        })
        return config
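
For context, a minimal sketch (assuming patch_size=16 and a float32 input; not the actual training pipeline) of how patch_dims ends up as None when every input dimension is unknown, which is the case in the traceback below:

import tensorflow as tf

@tf.function(input_signature=[tf.TensorSpec([None, None, None, None], tf.float32)])
def extract(images):
    patches = tf.image.extract_patches(
        images=images,
        sizes=[1, 16, 16, 1],
        strides=[1, 16, 16, 1],
        rates=[1, 1, 1, 1],
        padding='VALID'
    )
    # Runs at trace time: the height, width and channel dims are all unknown,
    # so the static patch dimension (patch_size * patch_size * channels) is unknown too.
    print("static patch dim:", patches.shape[-1])  # prints: None
    return patches

extract.get_concrete_function()  # force tracing with the fully dynamic signature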

Relevant log output

history = model.fit(
     27     train_generator,
     28     validation_data=valid_generator,
     29     batch_size=BATCH_SIZE,
     30     epochs=NUM_EPOCHS,
     31     callbacks=[
     32         checkpoint_callback, 
     33         tf.keras.callbacks.EarlyStopping(patience=5, monitor='val_Accuracy', mode='max' ,restore_best_weights=True)
     34     ],
     35 )
     37 model.load_weights(checkpoint_filepath)
     38 _, accuracy, top_5_accuracy = model.evaluate(x_test, y_test)

File ~\AppData\Roaming\Python\Python39\site-packages\keras\src\utils\traceback_utils.py:70, in filter_traceback.<locals>.error_handler(*args, **kwargs)
     67     filtered_tb = _process_traceback_frames(e.__traceback__)
     68     # To get the full stack trace, call:
     69     # `tf.debugging.disable_traceback_filtering()`
---> 70     raise e.with_traceback(filtered_tb) from None
     71 finally:
     72     del filtered_tb

File ~\AppData\Local\Temp\__autograph_generated_filelwsbm473.py:15, in outer_factory.<locals>.inner_factory.<locals>.tf__train_function(iterator)
     13 try:
     14     do_return = True
---> 15     retval_ = ag__.converted_call(ag__.ld(step_function), (ag__.ld(self), ag__.ld(iterator)), None, fscope)
     16 except:
     17     do_return = False

File ~\AppData\Local\Temp\__autograph_generated_filewl0d9m3r.py:13, in outer_factory.<locals>.inner_factory.<locals>.tf__call(self, images)
     11 patches = ag__.converted_call(ag__.ld(tf).image.extract_patches, (), dict(images=ag__.ld(images), sizes=[1, ag__.ld(self).patch_size, ag__.ld(self).patch_size, 1], strides=[1, ag__.ld(self).patch_size, ag__.ld(self).patch_size, 1], rates=[1, 1, 1, 1], padding='VALID'), fscope)
     12 patch_dims = ag__.ld(patches).shape[-1]
---> 13 patches = ag__.converted_call(ag__.ld(tf).reshape, (ag__.ld(patches), [ag__.ld(batch_size), -1, ag__.ld(patch_dims)]), None, fscope)
     14 try:
     15     do_return = True

ValueError: in user code:

    File "C:\Users\aksh1\AppData\Roaming\Python\Python39\site-packages\keras\src\engine\training.py", line 1338, in train_function  *
        return step_function(self, iterator)
    File "C:\Users\aksh1\AppData\Roaming\Python\Python39\site-packages\keras\src\engine\training.py", line 1322, in step_function  **
        outputs = model.distribute_strategy.run(run_step, args=(data,))
    File "C:\Users\aksh1\AppData\Roaming\Python\Python39\site-packages\keras\src\engine\training.py", line 1303, in run_step  **
        outputs = model.train_step(data)
    File "C:\Users\aksh1\AppData\Roaming\Python\Python39\site-packages\keras\src\engine\training.py", line 1080, in train_step
        y_pred = self(x, training=True)
    File "C:\Users\aksh1\AppData\Roaming\Python\Python39\site-packages\keras\src\utils\traceback_utils.py", line 70, in error_handler
        raise e.with_traceback(filtered_tb) from None
    File "C:\Users\aksh1\AppData\Local\Temp\__autograph_generated_filewl0d9m3r.py", line 13, in tf__call
        patches = ag__.converted_call(ag__.ld(tf).reshape, (ag__.ld(patches), [ag__.ld(batch_size), -1, ag__.ld(patch_dims)]), None, fscope)

    ValueError: Exception encountered when calling layer 'patches_3' (type Patches).
    
    in user code:
    
        File "C:\Users\aksh1\AppData\Local\Temp\ipykernel_11744\2749589268.py", line 17, in call  *
            patches = tf.reshape(patches, [batch_size, -1, patch_dims])
    
        ValueError: Tried to convert 'shape' to a tensor and failed. Error: None values not supported.
    
    
    Call arguments received by layer 'patches_3' (type Patches):
      • images=tf.Tensor(shape=(None, None, None, None), dtype=float32)

@fchollet
Collaborator

You need to do patch_dims = tf.shape(patches)[-1] to get the runtime shape, as opposed to the symbolic shape (which is None here).
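
A minimal sketch of that fix applied to the call method from the issue (same layer, only patch_dims changed; nothing else about the model is assumed):

    def call(self, images):
        batch_size = tf.shape(images)[0]
        patches = tf.image.extract_patches(
            images=images,
            sizes=[1, self.patch_size, self.patch_size, 1],
            strides=[1, self.patch_size, self.patch_size, 1],
            rates=[1, 1, 1, 1],
            padding='VALID'
        )
        patch_dims = tf.shape(patches)[-1]  # runtime shape: a scalar tensor, never None
        return tf.reshape(patches, [batch_size, -1, patch_dims])

Because batch_size already comes from tf.shape(images), building the whole reshape target from dynamic shapes keeps None out of the shape argument entirely.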
