5 comments

  • ENGNR 21 days ago
    This looks interesting. If I’m understanding correctly, it takes the raw HTML elements of a page and saves them into a RAG index? Is this because feeding the whole page into an LLM prompt would exceed the context window?

    Also love that it has both backend source code and a service available

    • nomad_ankur 20 days ago
      It can work with raw HTML, or directly with the user’s data as JSON/arrays in React/Angular-style applications.

      Feeding the whole page doesn’t always make sense; it can be full of repetitive menu and layout markup. This gives the developer control over what they feed to the prompt. At the end of the day everything has a cost attached, and this helps optimise that cost to the last mile (rough sketch below).

      Give it a try, would love some feedback.
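
      A minimal sketch of the "control what you feed" idea, assuming the standard OpenAI Node SDK; `appState` and its fields are made up for illustration and are not the framework’s actual API:

        import OpenAI from "openai";

        const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

        // Hypothetical app state; in a real React/Angular app this would
        // come from your store or component props.
        const appState = {
          cart: [{ sku: "tee-001", qty: 2 }],
          user: { plan: "pro" },
        };

        async function ask(userQuery: string) {
          // Feed only the fields the assistant needs, not
          // document.documentElement.outerHTML with its menus and layout.
          const context = { cart: appState.cart, plan: appState.user.plan };
          const res = await openai.chat.completions.create({
            model: "gpt-4o-mini",
            messages: [
              { role: "system", content: `App context: ${JSON.stringify(context)}` },
              { role: "user", content: userQuery },
            ],
          });
          return res.choices[0].message.content;
        }

      Trimming the context down like this is also what keeps the token bill small.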

  • Alifatisk 20 days ago
    What does it mean to have a native AI assistant? Is it like Siri on my iPhone, while Google Assistant would count as non-native?
    • nomad_ankur 20 days ago
      It integrates deeply with the app and has native functions and backend APIs, so it can perform in-app UI/UX workflows based on the user’s query. A lot of AI assistants are just chatbots running in a completely isolated thread. (Sketch of the idea below.)
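
      A rough sketch of what "native functions" could mean, assuming a hand-rolled registry; `openSettings`, `applyCoupon`, `router`, and `cart` are illustrative stand-ins, not the framework’s real API:

        // Stubs standing in for the app's real navigation and cart services.
        const router = { navigate: (path: string) => console.log("navigate:", path) };
        const cart = { applyCoupon: (code: string) => console.log("coupon:", code) };

        type AppFunction = {
          description: string;
          parameters: object;            // JSON Schema for the arguments
          handler: (args: any) => void;  // runs a real in-app workflow
        };

        const registry: Record<string, AppFunction> = {
          openSettings: {
            description: "Navigate the user to the settings screen",
            parameters: { type: "object", properties: {} },
            handler: () => router.navigate("/settings"),
          },
          applyCoupon: {
            description: "Apply a coupon code to the current cart",
            parameters: {
              type: "object",
              properties: { code: { type: "string" } },
              required: ["code"],
            },
            handler: (args) => cart.applyCoupon(args.code),
          },
        };
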
  • lizhenqi 20 days ago
    Can you explain in detail how the AI triggers execution of the actions a developer registers? I’m a bit curious.
    • nomad_ankur 20 days ago
      In code, various functions are registered with their function and parameter definitions. Based on the text or voice query, the framework uses OpenAI function calling to invoke the right one (rough sketch below).
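
      A minimal sketch of that dispatch loop with the OpenAI Node SDK; the `openSettings` tool and its handler are illustrative, not the framework’s actual definitions:

        import OpenAI from "openai";

        const openai = new OpenAI();

        // Illustrative handler table (see the registry sketch above).
        const handlers: Record<string, (args: any) => string> = {
          openSettings: () => "navigated to /settings",
        };

        const tools = [
          {
            type: "function" as const,
            function: {
              name: "openSettings",
              description: "Navigate the user to the settings screen",
              parameters: { type: "object", properties: {} },
            },
          },
        ];

        async function handleQuery(query: string) {
          const res = await openai.chat.completions.create({
            model: "gpt-4o-mini",
            messages: [{ role: "user", content: query }],
            tools,
          });
          const call = res.choices[0].message.tool_calls?.[0];
          if (call) {
            // The model chose a function: parse its JSON arguments and run it.
            const args = JSON.parse(call.function.arguments || "{}");
            return handlers[call.function.name]?.(args);
          }
          return res.choices[0].message.content; // plain answer, no tool call
        }
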
  • shaileshjswl 20 days ago
    Can you share some more use cases where you’ve seen it working?
  • purplecats 20 days ago
    Do you have a demo to try without building it first?