
ValueError: A KerasTensor is symbolic: it's a placeholder for a shape an a dtype. It doesn't have any actual numerical value. You cannot convert it to a NumPy array. #1401

accioharshita opened this issue Mar 26, 2024 · 7 comments

@accioharshita
Hey, so I've downloaded the preprocessing and encoder layers of BERT in order to build a simple email classification model. When I finally build the model to pass in the training data, it throws this error. Can someone tell me what's wrong?

[Two screenshots attached: the model-building code and the resulting ValueError traceback]
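For context, the tensorflow_hub pattern this error usually comes from looks roughly like the sketch below; the handles are assumptions on my part, since the original code is only visible in the screenshots. Under Keras 3 (TensorFlow 2.16+), calling a hub.KerasLayer on a symbolic Keras input raises the ValueError above.

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text as text  # registers the ops the preprocessing model needs

# Hypothetical TF Hub handles -- the originals are only in the screenshots.
preprocess_url = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
encoder_url = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

text_input = tf.keras.layers.Input(shape=(), dtype=tf.string)
preprocessed = hub.KerasLayer(preprocess_url)(text_input)  # Keras 3 fails here
outputs = hub.KerasLayer(encoder_url, trainable=True)(preprocessed)
pooled_output = outputs["pooled_output"]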

@LeviBen commented Mar 31, 2024

I have the same problem. Loading the model from the TF Hub URL worked fine, but with the model stored locally it crashes for some reason.

@SDSCodes22

@accioharshita Hi, I was having the same problem; in fact, I was using the exact same code as you. I managed to solve it by importing BERT through the keras_nlp library. Here is the code I ended up with:

import tensorflow as tf
import keras_nlp

text_input = tf.keras.layers.Input(shape=(), dtype=tf.string)
preprocessor = keras_nlp.models.BertPreprocessor.from_preset("bert_base_en_uncased", trainable=True)
encoder_inputs = preprocessor(text_input)
encoder = keras_nlp.models.BertBackbone.from_preset("bert_base_en_uncased")
outputs = encoder(encoder_inputs)
pooled_output = outputs["pooled_output"]      # [batch_size, 768]
sequence_output = outputs["sequence_output"]  # [batch_size, seq_length, 768]
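To finish the classifier from here, a minimal sketch (the dropout rate and single sigmoid unit are assumptions of mine, not from this thread):

x = tf.keras.layers.Dropout(0.1)(pooled_output)             # assumed regularization
output = tf.keras.layers.Dense(1, activation="sigmoid")(x)  # assumed binary email classifier
model = tf.keras.Model(inputs=text_input, outputs=output)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])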

@BismaAyaz commented Apr 17, 2024

@SoumyaCodes2020 can you please let me know how you saved the model with this approach? I'm saving with model3.save("model3.keras") and loading with model3 = keras.models.load_model("model3.keras"), but I get this error:

No vocabulary has been set for WordPieceTokenizer. Make sure to pass a `vocabulary` argument when creating the layer.
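One common workaround (an assumption on my part, not confirmed in this thread) is to sidestep full-model serialization: save only the weights, rebuild the same architecture in code, and load the weights into it.

# Hedged sketch: persist weights only, since the tokenizer vocabulary
# is apparently not restored from the .keras archive here.
model3.save_weights("model3.weights.h5")

# ... later, reconstruct the identical architecture (including the preprocessor), then:
model3.load_weights("model3.weights.h5")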

@leinonk1 commented Jun 1, 2024

My problem disappeared when I installed tensorflow 2.15.1 and tensorflow-text 2.15.0 instead of the newest 2.16.0; the tensorflow-hub version was 0.16.1. (My comment below does not apply to this solution.)

@SoumyaCodes2020 how many trainable params do you have in the model with that approach (when you call model.summary())? It seems that this approach leaves the BERT layer parameters trainable (even with trainable=False). I had to solve it by looping through all the layers in the encoder and setting them to non-trainable:

for layer in encoder.layers:
    layer.trainable = False
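A quick way to verify the freeze (a sketch; `model` is assumed to be the full classifier wrapping the encoder):

model.summary()  # trainable params should now exclude the ~110M BERT backbone weights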

@arc2226 commented Jun 11, 2024

> I managed to solve my problem by importing BERT through the keras_nlp library. (quoting @SDSCodes22's solution above)

This worked for me too.

@AlbertoMQ

This is not an appropriate solution to the problem.

mojc added a commit to mojc/essay_scoring that referenced this issue Sep 12, 2024
@dnydlk commented Dec 8, 2024

I found a solution in this issue from the tensorflow_hub repo's issue tracker:

import os
os.environ['TF_USE_LEGACY_KERAS'] = '1'  # must run before TensorFlow is imported

import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text as text

I put the first two lines (setting the environment variable) before running all the other imports, and it worked.
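For what it's worth, on TensorFlow 2.16+ this switch only takes effect if the tf-keras package is also installed, and the variable must be set before the first tensorflow import. A quick check that legacy Keras is active:

import os
os.environ["TF_USE_LEGACY_KERAS"] = "1"

import tensorflow as tf
print(tf.keras.__version__)  # a 2.x version here means legacy Keras is in use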
