Error message:

D:\Anaconda\envs\tensorflow\python.exe D:/PYCHARMprojects/Dailypractise/p25.py
2021-07-23 09:39:36.083143: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING:tensorflow:AutoGraph could not transform <bound method CustomVariationalLayer.call of <__main__.CustomVariationalLayer object at 0x000001D03D0C7670>> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: module 'gast' has no attribute 'Index'
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
Model: "functional_3"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==========================================================================================
input_1 (InputLayer)            [(None, 28, 28, 1)]  0
__________________________________________________________________________________________
conv2d (Conv2D)                 (None, 28, 28, 32)   320         input_1[0][0]
__________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 14, 14, 64)   18496       conv2d[0][0]
__________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 14, 14, 64)   36928       conv2d_1[0][0]
__________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 14, 14, 64)   36928       conv2d_2[0][0]
__________________________________________________________________________________________
flatten (Flatten)               (None, 12544)        0           conv2d_3[0][0]
__________________________________________________________________________________________
dense (Dense)                   (None, 32)           401440      flatten[0][0]
__________________________________________________________________________________________
dense_1 (Dense)                 (None, 2)            66          dense[0][0]
__________________________________________________________________________________________
dense_2 (Dense)                 (None, 2)            66          dense[0][0]
__________________________________________________________________________________________
lambda (Lambda)                 (None, 2)            0           dense_1[0][0]
                                                                 dense_2[0][0]
__________________________________________________________________________________________
functional_1 (Functional)       (None, 28, 28, 1)    56385       lambda[0][0]
__________________________________________________________________________________________
custom_variational_layer (Custo (None, 28, 28, 1)    0           input_1[0][0]
                                                                 functional_1[0][0]
==================================================================================================
Total params: 550,629
Trainable params: 550,629
Non-trainable params: 0
__________________________________________________________________________________________________
Epoch 1/10
Traceback (most recent call last):
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\execute.py", line 59, in quick_execute
    tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
TypeError: An op outside of the function building code is being passed
a "Graph" tensor. It is possible to have Graph tensors
leak out of the function building context by including a
tf.init_scope in your function building code.
For example, the following function will fail:
  @tf.function
  def has_init_scope():
    my_constant = tf.constant(1.)
    with tf.init_scope():
      added = my_constant * 2
The graph tensor has name: dense_2/BiasAdd:0

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:/PYCHARMprojects/Dailypractise/p25.py", line 106, in <module>
    vae.fit(x=x_train, y=None,
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\keras\engine\training.py", line 108, in _method_wrapper
    return method(self, *args, **kwargs)
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\keras\engine\training.py", line 1098, in fit
    tmp_logs = train_function(iterator)
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\def_function.py", line 780, in __call__
    result = self._call(*args, **kwds)
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\def_function.py", line 840, in _call
    return self._stateless_fn(*args, **kwds)
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\function.py", line 2829, in __call__
    return graph_function._filtered_call(args, kwargs)  # pylint: disable=protected-access
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\function.py", line 1843, in _filtered_call
    return self._call_flat(
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\function.py", line 1923, in _call_flat
    return self._build_call_outputs(self._inference_function.call(
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\function.py", line 545, in call
    outputs = execute.execute(
  File "D:\Anaconda\envs\tensorflow\lib\site-packages\tensorflow\python\eager\execute.py", line 72, in quick_execute
    raise core._SymbolicException(
tensorflow.python.eager.core._SymbolicException: Inputs to eager execution function cannot be Keras symbolic tensors, but found [<tf.Tensor 'dense_2/BiasAdd:0' shape=(None, 2) dtype=float32>, <tf.Tensor 'dense_1/BiasAdd:0' shape=(None, 2) dtype=float32>]

Process finished with exit code 1


Fix:

The exception shows that Keras symbolic tensors (dense_2/BiasAdd:0 and dense_1/BiasAdd:0, the latent-distribution outputs that the CustomVariationalLayer's loss references) are being passed into the eager train function, which TF 2.x does not allow. The workaround is to fall back to graph-mode training by disabling eager execution. Add the following at the very top of the script, before any layer or model is built:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()  # train in graph mode, where symbolic Keras tensors are allowed
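
For reference, here is a minimal, self-contained sketch of the same pattern. The AddLossLayer and the toy model below are hypothetical stand-ins (not the CustomVariationalLayer/VAE from p25.py); they only illustrate that, once eager execution is disabled, a model whose loss is attached via add_loss() can be trained with fit(x, None) just like the original script.

import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # must run before any graphs, ops, or tensors are created

from tensorflow.keras import layers, Model, Input


class AddLossLayer(layers.Layer):
    # Hypothetical stand-in for CustomVariationalLayer: the loss is attached
    # via add_loss() instead of being returned as a model output.
    def call(self, inputs):
        x, reconstruction = inputs
        self.add_loss(tf.reduce_mean(tf.square(x - reconstruction)))
        return reconstruction


inp = Input(shape=(4,))
hidden = layers.Dense(8, activation="relu")(inp)
recon = layers.Dense(4)(hidden)
out = AddLossLayer()([inp, recon])

model = Model(inp, out)
model.compile(optimizer="rmsprop", loss=None)  # the only loss comes from add_loss()
model.fit(np.random.rand(64, 4).astype("float32"), None, epochs=1, batch_size=16)

Note that the disable call has to precede model construction: TensorFlow documents that disable_eager_execution() can only be called before any graphs, ops, or tensors have been created, so placing it after the model is built has no effect.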