• gerryflap@feddit.nl · 1 month ago

    No. ChatGPT pulls information out of its ass, whereas, as I read it, SearchGPT actually links to sources (while also summarizing them and presumably still pulling some information out of its ass). ChatGPT “knows” things and SearchGPT should actually look stuff up and present it to you.

    • kosmoz@lemm.ee · 1 month ago

      Kagi has supported this for a while. You can end your query with a question mark to request a “quick answer” generated by an LLM, complete with sources and citations. It’s surprisingly accurate and useful!

    • helenslunch@feddit.nl · 1 month ago

      ChatGPT “knows” things and SearchGPT should actually look stuff up and present it to you.

      …where do you think CGPT gets the information it “knows” from?

      • gerryflap@feddit.nl · 1 month ago

        From the training dataset, which was frozen many years ago. It’s like answering from memory instead of looking something up. It doesn’t provide sources; it just makes shit up based on what was in that (old) dataset. That’s totally different from looking up the information, using what you already know to interpret it, and then producing an informed answer backed up by sources.
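
        For what it’s worth, the distinction being argued here boils down to roughly the following sketch. This is not any vendor’s actual API; the `LLM` and `SearchEngine` interfaces and all function names are hypothetical placeholders, just to contrast “answer from frozen training data” with “retrieve first, then summarize with citations.”

        ```python
        from dataclasses import dataclass
        from typing import Protocol


        @dataclass
        class SearchResult:
            title: str
            url: str
            snippet: str


        class LLM(Protocol):
            def generate(self, prompt: str) -> str: ...


        class SearchEngine(Protocol):
            def search(self, query: str, max_results: int) -> list[SearchResult]: ...


        def parametric_answer(llm: LLM, question: str) -> str:
            # "ChatGPT-style": the model answers purely from whatever was in its
            # frozen training data. Nothing is looked up, and no sources are given.
            return llm.generate(prompt=question)


        def search_backed_answer(llm: LLM, search: SearchEngine, question: str) -> str:
            # "SearchGPT / Kagi quick answer"-style: fetch fresh results first,
            # then ask the model to summarize them with numbered citations.
            results = search.search(question, max_results=5)
            sources = "\n\n".join(
                f"[{i + 1}] {r.title} ({r.url})\n{r.snippet}"
                for i, r in enumerate(results)
            )
            prompt = (
                "Answer the question using only the sources below, "
                "citing them as [1], [2], ...\n\n"
                f"Sources:\n{sources}\n\nQuestion: {question}"
            )
            return llm.generate(prompt=prompt)
        ```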