
Using Custom vision exported model with tensorflow JS and input an image



I'm new to TensorFlow.js and TensorFlow.



The context: we have trained a model with Custom Vision to recognize hair length (short, mid, long) from an image. This model was exported, and we would like to use it locally with TensorFlow.js. The files exported from Custom Vision are a *.pb file and a labels.txt file.



I have used the tensorflowjs_converter Python script; here is the command I used to convert the frozen *.pb model into a JSON model:



tensorflowjs_converter --input_format=tf_frozen_model --output_node_names='model_outputs' --output_json OUTPUT_JSON C:\python\tf_models\hairlength\model.pb C:\python\tf_models\exports
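
To sanity-check what the converter produced, the generated model.json can be loaded and its graph node names logged. This is a sketch I would add, not code from the original setup; it assumes a tfjs 1.x version where loadGraphModel exists and the returned graph model exposes inputNodes / outputNodes getters:

    // Sketch (assumption: @tensorflow/tfjs ~1.x with loadGraphModel and the
    // inputNodes/outputNodes getters on the returned GraphModel).
    import * as tf from '@tensorflow/tfjs';

    async function inspectConvertedModel() {
      const model = await tf.loadGraphModel('assets/models/hairlength/model.json');
      console.log('input nodes:', model.inputNodes);   // should NOT be 'model_outputs'
      console.log('output nodes:', model.outputNodes); // expected: ['model_outputs']
    }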


Then I copy this model.json and the shard files into the assets folder of my Angular client. I load the model and give it an image to get a prediction, but all I get are index values that are way out of bounds, since I only expect 0: long, 1: mid, 2: short. Here is a capture of the console:
(screenshot: prediction list)



This is the class I use in my client (TypeScript) for the predictions:



import * as tf from '@tensorflow/tfjs';
// import { HAIRLENGTH_LABELS } from './hairlength';
import { FrozenModel } from '@tensorflow/tfjs';

const MODEL = 'assets/models/hairlength/model.json';
const INPUT_NODE_NAME = 'model_outputs';
const OUTPUT_NODE_NAME = 'model_outputs';
const PREPROCESS_DIVISOR = tf.scalar(255 / 2);

export class MobileNetHairLength {

  private model: FrozenModel;
  private labels = ['long', 'mid', 'short'];

  constructor() {}

  async load() {
    this.model = await tf.loadGraphModel(MODEL);
  }

  dispose() {
    if (this.model) {
      this.model.dispose();
    }
  }

  /**
   * Infer through MobileNet. This does standard ImageNet pre-processing before
   * inferring through the model. This method returns named activations as well
   * as softmax logits.
   *
   * @param input un-preprocessed input Array.
   * @return The softmax logits.
   */
  predict(input) {
    const preprocessedInput = tf.div(
      tf.sub(input, PREPROCESS_DIVISOR),
      PREPROCESS_DIVISOR);
    const reshapedInput =
      preprocessedInput.reshape([1, ...preprocessedInput.shape]);
    return this.model.execute({ [INPUT_NODE_NAME]: reshapedInput }, OUTPUT_NODE_NAME);
  }

  getTopKClasses(logits, topK: number) {
    const predictions = tf.tidy(() => {
      return tf.softmax(logits);
    });

    const values = predictions.dataSync();
    predictions.dispose();

    let predictionList = [];
    for (let i = 0; i < values.length; i++) {
      predictionList.push({ value: values[i], index: i });
    }

    predictionList = predictionList
      .sort((a, b) => {
        return b.value - a.value;
      })
      .slice(0, topK);

    console.log(predictionList);
    return predictionList.map(x => {
      return { label: this.labels[x.index], value: x.value };
    });
  }
}
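
For reference, the preprocessing in predict() is the standard MobileNet mapping of [0, 255] pixel values to [-1, 1], since PREPROCESS_DIVISOR is 127.5. A quick stand-alone check (my own sketch, not part of the original class):

    // (x - 127.5) / 127.5 maps pixel values from [0, 255] to [-1, 1].
    const check = tf.tensor1d([0, 127.5, 255]);
    tf.div(tf.sub(check, PREPROCESS_DIVISOR), PREPROCESS_DIVISOR).print(); // [-1, 0, 1]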




And this is the class that calls the one above; I just pass it the canvas element:



import 'babel-polyfill';
import * as tf from '@tensorflow/tfjs';
import { MobileNetHairLength } from './mobilenet-hairlength';

export class PredictionHairLength {

  constructor() {}

  async predict(canvas) {
    const mobileNet = new MobileNetHairLength();
    await mobileNet.load();
    const pixels = tf.browser.fromPixels(canvas);

    console.log('Prediction');
    const result = mobileNet.predict(pixels);
    const topK = mobileNet.getTopKClasses(result, 3);

    topK.forEach(x => {
      console.log(`${x.value.toFixed(3)}: ${x.label}\n`);
    });

    mobileNet.dispose();
  }
}
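
One caveat worth noting (my assumption, not confirmed by the export): MobileNet-based Custom Vision models usually expect a fixed input size such as 224x224, while tf.browser.fromPixels returns the canvas at its native resolution. If the sizes differ, the tensor would need resizing before predicting, e.g.:

    // Hypothetical resize step; 224x224 is assumed here, the real input size
    // should be taken from the exported model's metadata.
    const resized = tf.image.resizeBilinear(pixels, [224, 224]);
    const result = mobileNet.predict(resized);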




My questions are:

Is the converter command correct?

Did I miss something in my client to get the correct indexes?

Thank you for your time and answers.

If you need more information, I would be glad to provide it.










Tags: angular, tensorflow, tensorflow.js, tensorflowjs-converter, microsoft-custom-vision






asked by RLoris

1 Answer
I think it's a simple bug in your code: const INPUT_NODE_NAME = 'model_outputs'; should probably be 'model_inputs' or whatever it actually is. Here, you're setting the output to be the input image and then reading it back without predicting anything.






answered by David Soergel
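
A minimal sketch of what that fix could look like, assuming the input node is named 'Placeholder' (a name Custom Vision frozen graphs often use; the actual name is not given in the question and should be confirmed against model.inputNodes or the model.json):

    const INPUT_NODE_NAME = 'Placeholder';   // assumed; verify with model.inputNodes
    const OUTPUT_NODE_NAME = 'model_outputs';

    // ...inside predict(): feed the image tensor to the input node and
    // fetch the logits from the output node.
    return this.model.execute({ [INPUT_NODE_NAME]: reshapedInput }, OUTPUT_NODE_NAME);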