This past week, ChatGPT went down for all users. According to Gizmodo, “OpenAI’s outages have become somewhat common, which makes it difficult for people to rely on ChatGPT in their workdays.” OpenAI estimates that the free version of ChatGPT has about 100 million weekly active users, and a growing number of reports reveal just how much energy is already being used to power AI tools.

In the near future, the cost of using AI tools will likely be tied to usage, much as website hosting fees are based on web traffic. When that happens, more of us may be forced to ask whether we really need or want AI to do a task. In this future, forgoing AI for a task could be akin to skipping the plastic bag at checkout to save money or the environment. If the costs or negative environmental impacts continue to climb (as they are already showing signs of doing), it could also lead to more discussion of AI conservation: reducing our individual consumption of processing power in the same way we talk about reducing our carbon footprint.