Google Assistant request is INTENT, so nlpjs not called, nluData not created

google-assistant

#1

Take a Google Assistant action with a single intent and a single Type that collects all text.
GA sends a Request of type INTENT that already contains Intent and Entities properties.

So the Jovo interpretation module does not call NlpjsNlu.processText.
this.$input.entities contains the full utterance, captured by the GA Type that collects any/all text.
this.$input does not contain an nluData property.
this.$input contains the intent and entities from GA, not from NLP.js.

  • Adding INTENT to supportedTypes does not trigger NlpjsNlu.
  • Calling NlpjsNlu.processText directly fails.
  • NlpjsNlu is never added to GoogleAssistantPlatform.plugins; it stays {} even when it is added in the config.
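
A sketch of the gating I believe is happening (the names and logic here are illustrative, not the actual Jovo source): an NLU plugin only runs when it is actually registered on the platform and the incoming input type is in its supportedTypes, which would explain why the INTENT entry alone changes nothing:

```typescript
// Illustrative sketch (not the actual Jovo source) of how an NLU plugin
// decides whether to run during the interpretation middleware.
type InputType = 'INTENT' | 'TEXT' | 'TRANSCRIBED_SPEECH' | 'SPEECH';

interface NluInput {
  type: InputType;
  text?: string;
  intent?: string;
}

// The plugin is skipped when it was never registered on the platform,
// or when the input type is outside supportedTypes, or there is no text.
function shouldRunNlu(
  input: NluInput,
  supportedTypes: InputType[],
  pluginRegistered: boolean,
): boolean {
  return pluginRegistered && supportedTypes.includes(input.type) && !!input.text;
}

// A Google Assistant request arrives as INTENT; if NlpjsNlu never made it
// into GoogleAssistantPlatform.plugins, the check fails regardless of config.
const gaInput: NluInput = { type: 'INTENT', text: 'turn on the lights', intent: 'LightsIntent' };
console.log(shouldRunNlu(gaInput, ['INTENT', 'TEXT'], false)); // false: plugin not registered
console.log(shouldRunNlu(gaInput, ['INTENT', 'TEXT'], true));  // true once registration works
```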

How can we get NlpjsNlu to parse the complete utterance supplied by GA, so that this.$input gains an nluData property containing the NLP.js intent and entities?


#2

Hey @David_MacDougall,

Thank you for the details! Could you show how you’re adding NLPjs to the Google Assistant platform? Pasting the relevant part of your app.ts code would be ideal.


#3

Hi Jan

First I declare nlpjs. I've added INTENT here, but it makes no difference in GoogleAssistantPlatform whether it is included or not.

const nlpjs = new NlpjsNlu({
    input: {
      supportedTypes: ['INTENT', 'TEXT', 'TRANSCRIBED_SPEECH', 'SPEECH'],
    },
    languageMap: { 
      en: LangEn,
    },
    preTrainedModelFilePath: './model.nlp',
    useModel: false,
    modelsPath: './models',
});

Then I use it in both GoogleAssistantPlatform and CorePlatform. I've since found out NLP.js is the default NLU, so it may not be required in CorePlatform, but I've tried it in one, then the other, then both: it never shows up in the GoogleAssistantPlatform plugins property, and nluData never appears in this.$input.

const app = new App({

  components: [EchoComponent, GlobalComponent],

  plugins: [
    // Add Jovo plugins here
    new GoogleAssistantPlatform({
      plugins: [
        nlpjs
      ],
    }),
    new CorePlatform({
      plugins: [
        nlpjs
      ],
    }),
    new GoogleSheetsCms({
      caching: false,
      serviceAccount: ServiceAccount,
      spreadsheetId: '1Oe_Z.............G3NcUY13Q',
      sheets: {
        translations: new TranslationsSheet(),
      },
    }),
  ],
});

Does this help?


#4

To the second point, here is my code attempting to call processText() directly.

I can see that the first argument (jovo) is only used to get the locale, which is never where it's looking because the Google request is formatted differently, so it always defaults to 'en'. In this case that is fine and should not cause an error, but the call still fails, even though it's the same function the interpretation module calls.

I realise this is an attempted workaround, but why can't I call it directly?

app.hook('after.interpretation.end', async (jovo) => {
  let utterance = jovo.$input.text;

  // all references are good entering this next line
  const nluProcessResult = await jovo.$plugins.CorePlatform.plugins.NlpjsNlu.processText(jovo, utterance);
  // fails with Cannot read properties of undefined (reading 'toJSON')
  if (nluProcessResult) {
    jovo.$input.nlu = nluProcessResult;
    jovo.$entities = nluProcessResult.entities || {};
  }
});
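
A guarded variant of the hook body above (the plugin path is the one from my snippet; the optional-chaining checks and the helper shape are my addition) at least turns the undefined crash into a visible log line, which helps confirm whether the plugin is reachable at all:

```typescript
// Guarded variant of the hook body. If NlpjsNlu was never attached (the
// empty {} plugins object), this logs and returns instead of throwing.
interface JovoLike {
  $plugins: Record<string, any>;
  $input: { text?: string; nlu?: unknown };
  $entities: Record<string, unknown>;
}

async function nluFallback(jovo: JovoLike): Promise<void> {
  const utterance = jovo.$input.text;
  // Same lookup path as in the snippet above, but guarded.
  const nlpjs = jovo.$plugins.CorePlatform?.plugins?.NlpjsNlu;
  if (!utterance || typeof nlpjs?.processText !== 'function') {
    console.warn('NlpjsNlu not reachable or no text; skipping NLU fallback');
    return;
  }
  const result = await nlpjs.processText(jovo, utterance);
  if (result) {
    jovo.$input.nlu = result;
    jovo.$entities = (result.entities as Record<string, unknown>) ?? {};
  }
}
```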

#5

@jan can you provide a hint here? Am I going about this all wrong? Should I try to implement nlp.js as a separate npm module and use its NLU outside of Jovo and its plugins? I've been working on other sections, but I want to tackle the NLU side of things in the coming week. Do you have any pointers? Thanks