

Looks like the title has been updated: “Eating Kosher in the Heart of Syria: Lamb-Stuffed Zucchini but Hold the Yogurt”


That gives me a little more foundation for it. I think you’re right.


I largely agree, but one situation I can think of where condensing the work makes sense is for experienced professionals who already meet the learning outcomes. Their goal is to demonstrate that they know the material and come away with a degree as proof, not to learn the material for the first time.


The “paperclip maximizer” is a great, realistic example of how an AI could end up destroying human civilization.


Internet access is becoming as crucial as any other utility. For some people, losing it can interfere with work, school, or medical devices.
It’s not “the end of the world” if my electricity is out for 5 hours, but it might cost me hundreds of dollars in spoiled food. Likewise, it’s not “the end of the world” if my Internet is out for 5 hours, but it might “cost” me PTO.
One small correction: you switched units. You started with watt-hours (kWh, a unit of energy) and then switched to watts (GW, a unit of power). With the right units, it’s even more dramatic.
There are about 730 hours in an average month. If a home consumes 1,000 kWh per month, that’s an average draw of about 1.37 kW. Dividing 9 GW by 1.37 kW gives roughly 6.6 million.
So this data center will use the same amount of energy as over 6 million homes. For reference, Utah has a population of 3.5 million (total people, not total number of homes).
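
For anyone who wants to check it, here’s the same arithmetic as a quick Python sketch. The 9 GW draw and the 1,000 kWh/month household figure are just the numbers from upthread:

```python
HOURS_PER_MONTH = 8_760 / 12          # ≈ 730 hours in an average month

home_kwh_per_month = 1_000            # assumed average household consumption
datacenter_gw = 9                     # reported data center power draw

# Convert the household's monthly energy (kWh) into average power (kW),
# so we're comparing power with power.
home_avg_kw = home_kwh_per_month / HOURS_PER_MONTH      # ≈ 1.37 kW

# 1 GW = 1e6 kW
homes_equivalent = datacenter_gw * 1e6 / home_avg_kw
print(f"≈ {homes_equivalent / 1e6:.1f} million homes")  # ≈ 6.6 million
```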
Here’s another way of comparing the numbers. If this new data center draws 9 GW around the clock, that’s about 6,500 GWh per month, or a little under 79,000 GWh per year.
In 2025, Utah produced a new record of over 35,000 GWh.
So this data center alone would use more than double the energy Utah produced in 2025, and meeting its demand on top of existing load would require Utah to more than triple its 2025 output.
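
And the annual comparison, using the same 9 GW figure and the 35,000 GWh of 2025 Utah generation cited above:

```python
HOURS_PER_YEAR = 8_760

datacenter_gw = 9
utah_2025_gwh = 35_000   # Utah's record 2025 generation, from the comment above

# Running 9 GW around the clock for a year:
datacenter_gwh = datacenter_gw * HOURS_PER_YEAR            # 78,840 GWh

print(datacenter_gwh / utah_2025_gwh)                      # ≈ 2.25× Utah's output
print((datacenter_gwh + utah_2025_gwh) / utah_2025_gwh)    # ≈ 3.25× if added on top
```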