Disconnected graph when concatenating two models


I'm trying to build a new model based on part of a pre-trained model.



Here's some cleaned-up code.



Let's imagine we have model1 trained and want to add some layers defined in model2:



from tensorflow.keras.layers import Conv2D, Activation
from tensorflow.keras.models import Model, Sequential

model1 = Sequential([
    Conv2D(2, (3, 3), padding='same', input_shape=(6, 6, 1)),
    Activation('relu')
])
model2 = Sequential([
    Conv2D(3, (3, 3), padding='same', input_shape=(6, 6, 2)),
    Activation('softmax')
])

# Run model1's conv output through model2, then add one more softmax on top
model_merge = Model(inputs=model1.input,
                    outputs=Activation('softmax')(model2(model1.get_layer('conv2d').output)))


It looks a bit messy, but I added the extra softmax activation here to demonstrate that the graph is not disconnected.



Summary of model1:



_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 6, 6, 2)           20
_________________________________________________________________
activation (Activation)      (None, 6, 6, 2)           0
=================================================================
Total params: 20
Trainable params: 20
Non-trainable params: 0
_________________________________________________________________


Summary of model2:



_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_4 (Conv2D)            (None, 6, 6, 3)           57
_________________________________________________________________
activation_4 (Activation)    (None, 6, 6, 3)           0
=================================================================
Total params: 57
Trainable params: 57
Non-trainable params: 0
_________________________________________________________________


And the summary of model_merge:



_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_input (InputLayer)    (None, 6, 6, 1)           0
_________________________________________________________________
conv2d (Conv2D)              (None, 6, 6, 2)           20
_________________________________________________________________
sequential_2 (Sequential)    (None, 6, 6, 3)           57
_________________________________________________________________
activation_4 (Activation)    (None, 6, 6, 3)           0
=================================================================
Total params: 77
Trainable params: 77
Non-trainable params: 0
_________________________________________________________________


Let's prove this merged model is not disconnected:



layers = [layer.output for layer in model_merge.layers]
test1 = Model(inputs=model_merge.input, outputs=layers[-1])


Everything works just fine.



test1's summary:



_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_input (InputLayer)    (None, 6, 6, 1)           0
_________________________________________________________________
conv2d (Conv2D)              (None, 6, 6, 2)           20
_________________________________________________________________
sequential_2 (Sequential)    (None, 6, 6, 3)           57
_________________________________________________________________
activation_4 (Activation)    (None, 6, 6, 3)           0
=================================================================
Total params: 77
Trainable params: 77
Non-trainable params: 0
_________________________________________________________________


Here's the tragedy:



test2 = Model(inputs=model_merge.input, outputs=layers[-2])


The most important part of the feedback:



ValueError: Graph disconnected: cannot obtain value for tensor Tensor("conv2d_2_input:0", shape=(?, 6, 6, 2), dtype=float32) at layer "conv2d_2_input". The following previous layers were accessed without issue: []


Full traceback:



ValueErrorTraceback (most recent call last)
<ipython-input-18-946b325081c1> in <module>
----> 1 test = Model(inputs=model_merge.input, outputs=layers[-2])

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/training.py in __init__(self, *args, **kwargs)
119
120 def __init__(self, *args, **kwargs):
--> 121 super(Model, self).__init__(*args, **kwargs)
122 # Create a cache for iterator get_next op.
123 self._iterator_get_next = weakref.WeakKeyDictionary()

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/network.py in __init__(self, *args, **kwargs)
79 'inputs' in kwargs and 'outputs' in kwargs):
80 # Graph network
---> 81 self._init_graph_network(*args, **kwargs)
82 else:
83 # Subclassed network

/usr/local/lib/python3.5/dist-packages/tensorflow/python/training/checkpointable/base.py in _method_wrapper(self, *args, **kwargs)
440 self._setattr_tracking = False # pylint: disable=protected-access
441 try:
--> 442 method(self, *args, **kwargs)
443 finally:
444 self._setattr_tracking = previous_value # pylint: disable=protected-access

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/network.py in _init_graph_network(self, inputs, outputs, name)
219 # Keep track of the network's nodes and layers.
220 nodes, nodes_by_depth, layers, layers_by_depth = _map_graph_network(
--> 221 self.inputs, self.outputs)
222 self._network_nodes = nodes
223 self._nodes_by_depth = nodes_by_depth

/usr/local/lib/python3.5/dist-packages/tensorflow/python/keras/engine/network.py in _map_graph_network(inputs, outputs)
1850 'The following previous layers '
1851 'were accessed without issue: ' +
-> 1852 str(layers_with_complete_input))
1853 for x in node.output_tensors:
1854 computable_tensors.append(x)

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("conv2d_2_input:0", shape=(?, 6, 6, 2), dtype=float32) at layer "conv2d_2_input". The following previous layers were accessed without issue: []


It's really driving me crazy.



Any ideas?










Tags: python, tensorflow, keras, disconnected






asked Mar 9 at 3:45 by haojie yuan
edited Mar 9 at 4:42 by haojie yuan
          1 Answer
The layer you are trying to use as the output has two output nodes. The first connects the input of model2 to the output of model2. The second output node connects the output of model1 to the first layer of model2. By default, a layer's output property returns only the first output node. So what is happening is that you are trying to connect the input of model_merge (the input of model1) with the first output node.



The following code shows this. Individual output nodes of a layer can be accessed using its get_output_at() method.



          layer_output = model_merge.layers[-2].output # The first output node
          layer_output_1 = model_merge.layers[-2].get_output_at(0) # The first output node
          layer_output_2 = model_merge.layers[-2].get_output_at(1) # The second output node
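The bookkeeping behind this can be sketched schematically in plain Python (an illustration of the idea only, not real Keras internals; the class and tensor names here are invented for the example):

```python
# Schematic model of Keras layer "node" bookkeeping (illustration only).
# Each time a layer is called on a tensor, a new output node is recorded;
# the plain `.output` property is shorthand for the FIRST node.

class SketchLayer:
    def __init__(self, name):
        self.name = name
        self._output_nodes = []  # one entry per call of the layer

    def __call__(self, inbound):
        out = f"{self.name}/output_{len(self._output_nodes)}"
        self._output_nodes.append((inbound, out))
        return out

    @property
    def output(self):
        return self.get_output_at(0)  # `.output` only ever sees node 0

    def get_output_at(self, node_index):
        return self._output_nodes[node_index][1]

conv = SketchLayer("conv2d_2")
y0 = conv("conv2d_2_input")  # node 0: created when model2 itself was built
y1 = conv("conv2d_output")   # node 1: created when model_merge was built

assert conv.output == y0            # shorthand returns the first node
assert conv.get_output_at(1) == y1  # the node actually fed by model1
```

Calling the layer a second time (when model_merge is built) records node 1, but `.output` still points at node 0, whose upstream tensor is model2's own standalone input.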


Now, the following two lines throw an error because the graph is disconnected:



          test2 = Model(inputs=model_merge.input, outputs=layer_output)


          and



          test2 = Model(inputs=model_merge.input, outputs=layer_output_1)


But the code below does not throw an error, because the graph is connected:



          test2 = Model(inputs=model_merge.input, outputs=layer_output_2)
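For intuition on why the first two cases fail: when Model(inputs, outputs) is built, Keras walks backwards from each output and checks that every tensor is reachable from the given inputs (roughly what _map_graph_network does). Here is a simplified sketch of that check, with tensor names invented to mirror this example:

```python
# Simplified sketch of the reachability check behind "Graph disconnected"
# (illustration only, not real Keras code).

def is_connected(output, inputs, producers):
    """producers maps each tensor to the tensors it was computed from."""
    stack = [output]
    while stack:
        t = stack.pop()
        if t in inputs:
            continue  # reached a declared model input: fine
        if t not in producers:
            return False  # tensor with no producer -> "Graph disconnected"
        stack.extend(producers[t])
    return True

producers = {
    "conv2d_output": ["conv2d_input"],   # model1's conv layer
    "node0_output": ["conv2d_2_input"],  # model2 applied to its OWN input
    "node1_output": ["conv2d_output"],   # model2 applied to model1's output
}

# The second output node traces back to model_merge's input...
assert is_connected("node1_output", {"conv2d_input"}, producers)
# ...but the first one dead-ends at model2's standalone input tensor.
assert not is_connected("node0_output", {"conv2d_input"}, producers)
```

The first output node's chain bottoms out at "conv2d_2_input", which is not among the model's declared inputs, so the constructor raises exactly the ValueError shown in the question.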





          • Thanks a lot. So what's happening is: I defined this layer for the first time in model2, so that generated the first output node, connecting the input and output of model2; then, when I created model_merge, that generated a second output node, connecting the output of model1 to the output of model2, right?

            – haojie yuan
            Mar 9 at 8:02












          • Yes, that is exactly what is happening.

            – Mitiku
            Mar 9 at 8:46











          1 Answer
The layer you are trying to use as the output has two output nodes. The first connects the input of model2 to the output of model2. The second connects the output of model1 (via the first layer of model2) to the output of model2. By default, a layer's output attribute returns only the first output node. So what is happening is that you are trying to connect the input of model_merge (the input of model1) to the first output node, which is not reachable from it.



The following code shows this. Individual output nodes of a layer can be accessed with the layer's get_output_at() method.



          layer_output = model_merge.layers[-2].output # The first output node
          layer_output_1 = model_merge.layers[-2].get_output_at(0) # The first output node
          layer_output_2 = model_merge.layers[-2].get_output_at(1) # The second output node


Now the following two lines throw an error because the graph is disconnected:



          test2 = Model(inputs=model_merge.input, outputs=layer_output)


          and



          test2 = Model(inputs=model_merge.input, outputs=layer_output_1)


But the line below does not throw an error, because the graph is connected:



          test2 = Model(inputs=model_merge.input, outputs=layer_output_2)
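The node mechanics described above can be reproduced with a small standalone sketch (the layer names and sizes here are illustrative assumptions, not taken from the question): calling the same layer a second time, from a second model, gives it a second output node, and only that second node is reachable from model_merge's input.

```python
# Minimal sketch of a layer acquiring two output nodes (tf.keras).
# All names and shapes below are hypothetical, for illustration only.
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model

# Build model2 around a layer we will later reuse.
inp2 = Input(shape=(4,))
shared = Dense(3, name="shared")
out2 = shared(inp2)            # first call -> creates output node 0
model2 = Model(inp2, out2)

# Build model_merge by feeding another branch through the same layer.
inp1 = Input(shape=(4,))
feat = Dense(4, name="branch_dense")(inp1)
merged_out = shared(feat)      # second call -> creates output node 1
model_merge = Model(inp1, merged_out)

# Node 0 traces back to inp2; node 1 traces back to inp1.
# Only node 1 is reachable from model_merge's input:
connected = Model(model_merge.input, shared.get_output_at(1))  # works
# Model(model_merge.input, shared.get_output_at(0)) would raise
# ValueError ("Graph disconnected"), since node 0 depends on inp2.
```

The design point: a Keras node records one specific call of a layer, so get_output_at(i) selects which call's output tensor you are wiring into the new model.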





          • Thanks a lot. So what's happening is: I defined this layer for the first time in model2, so it generated the first output node connecting the input and output of model2; then I created model_merge, which generated a second output node connecting the output of model1 and the output of model2, right?

            – haojie yuan
            Mar 9 at 8:02












          • Yes, that is exactly what is happening.

            – Mitiku
            Mar 9 at 8:46















          answered Mar 9 at 7:33









          Mitiku











