…apparently, Microsoft’s take on it. Looking at their terms of service for Copilot, we read in the original bold…

Copilot is for entertainment purposes only…

While that’s good advice, we are pretty sure we’ve seen people use LLMs, including Copilot, for decidedly non-entertaining tasks. But, at least for now, if you are using Copilot for non-entertainment purposes, you are violating the terms of service.

We get it. They are just covering their… bases. When you do something stupid based on output from Copilot, they can say, “Oh, yeah, that was just for entertainment.” But they know what you are doing, and they even encourage it.
Heck, they’re doing it themselves. Would it stand up in court?

Now it is true that probably everyone will give you a similar warning. OpenAI, for example, has this to say…
Notice that it doesn’t pretend you are only using it for a chuckle. Anthropic’s terms have even more wording, but still stop short of claiming the product is a party game. Copilot, on the other hand, is for fun.

Source [web-archive]

  • shyguyblue@lemmy.world · 3 days ago

    We, as a species, failed to learn anything from the “GPS told me to drive into a lake!?” debacle, so here is history, just flat-out repeating itself…