Comments:
Hey Dilmer! Sorry to bother you again. I really want to register for your XR course. When will it be possible?
I love this kind of tutorial, featuring A.I. tools working together with voice recognition to make the game/app more user friendly.
Very useful. Thank you Bro. ✌🏻
Make more videos on Unity with Magic Leap 2!
Love the videos! I just had to know: is your VS Code theme the default, or is it something custom? I appreciate the awesome VR content you provide, my man!
.NET and Unity both have access to built-in voice recognition. So what makes this SDK so much better?
Great video! Could you help me correctly attach an XR reference image library from an AssetBundle in AR Foundation? I am able to load the library and attach it in Unity, but on Android ARCore complains that the library doesn't have any ARCore data, even though the markers are visible in the library in the Unity Editor.
There is no shape controller. Where do you implement that?
Thanks again Dilmer! Can you please also make a video on integrating Whisper by OpenAI with Unity?
Could you please create an AR voice-command image display using Vuforia, the Voice SDK, and Wit.ai?
Nice, bro.
How do we get past the limit of 60 requests per minute per app?
It works for me in the editor, but it doesn't when I build it to my Quest 2.
Thanks for the tutorial. Most of it works, but when using the response handler the words[] array did not work. On my first attempt it contained the text 'value' no matter what I said, and after trying the Meta tutorial as well, the words array contained the intent name, not the actual uttered word. The full transcript and partial transcript work fine. Any thoughts on what I'm doing wrong?
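For anyone hitting the same issue: in a Wit.ai response, the spoken word is usually found under the entities section, not under the intent. The sketch below is a minimal, hedged illustration in plain Python of where each piece lives in the documented Wit.ai response shape (`text`, `intents`, `entities`); the utterance, intent name `create_shape`, and entity key `shape:shape` are invented example data, not values from the video.

```python
import json

# Hypothetical Wit.ai /message response. Field names follow the
# documented Wit.ai response shape; the concrete values are made up.
raw = json.dumps({
    "text": "make a red cube",
    "intents": [{"name": "create_shape", "confidence": 0.99}],
    "entities": {
        # Entities are keyed "<entity>:<role>"; each match carries
        # the actual uttered word in its "value" field.
        "shape:shape": [{"body": "cube", "value": "cube", "confidence": 0.97}]
    }
})

response = json.loads(raw)

full_transcript = response["text"]            # the whole uttered sentence
top_intent = response["intents"][0]["name"]   # the classified intent name

# The spoken word itself: read it from the entity match's "value",
# not from the intent name.
shape_word = response["entities"]["shape:shape"][0]["value"]

print(full_transcript, top_intent, shape_word)
```

If your handler is reading the intent name where it expects the word, checking which node of the response it indexes into (intents vs. entities) may explain the symptom described above.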