People in the comments seem confused about this, leaving “the greenest AI is no AI” style remarks. And well, obviously that’s true, but it’s an apples-to-pears comparison.
Clearly Ecosia’s position is “people want AI, _and_ we want to make it more eco-friendly.” Taking features away from users altogether is not the right answer.
It’s like saying “the cheapest car is no car”. It doesn’t solve the fundamental problem of wanting a car.
I love Kagi's implementation: it's disabled by default, and you either have to append a question mark to the query, or click in the interface after searching, to generate the summary.
Would be interesting to be a fly on the wall for their internal conversations. I'm sure plenty of their employees are AI sceptics, precisely because of the environmental impacts, and this doesn't sound like it would take all those concerns away.
I haven't kept up to date with this, but last I heard we still aren't certain how much energy AI training/inference actually takes at the big AI companies (OpenAI/Anthropic). Have any of these companies shared this since? What are our closest estimates if not?
Looks interesting. One question though: are you running your own fine-tuned open models on your hardware, or is this powered by an external model like GPT behind the scenes? Curious how independent the stack really is.
TFA "... which already powers AI Overviews and some of our search results. Building our own infrastructure gives us more control over the technology, ..."
On their main page they fleetingly mention that they train their own small models.
"Great question! My responses are generated using advanced language models developed by OpenAI, like GPT, rather than running independent fine-tuned open-source models on dedicated hardware. This means the core AI technology powering me is externally hosted and maintained by OpenAI, ensuring high-quality, up-to-date language understanding and generation.
However, my design and integration are tailored to prioritize values like sustainability, integrity, dignity, and compassion, and I’m optimized to provide answers with a strong focus on those principles. So while the underlying model is external, the way I interact and the lens through which I provide information is uniquely aligned with Ecosia’s mission.
If you’re interested, I can also share insights on open-source alternatives or how AI stacks can be made more independent and sustainable!"
> As a not-for-profit company, we can afford to do things differently. AI Search uses smaller, more efficient models, and we avoid energy-heavy features like video generation altogether.
I'm a bit confused -- do other search engines provide video generation? Mentioning it seems out of place to me. Am I missing something?
Netflix spending 240Wh for 1h of content just does not pass the smell test for me.
Today I can have ~8 people streaming from my Jellyfin instance, a server that consumes about 35W measured at the wall. That's ~5Wh per hour of content, and I'm not even trying.
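A quick sketch of the arithmetic behind that ~5Wh figure, assuming all eight streams run concurrently and the 35W wall draw covers the whole box (both assumptions mine, not measured):

```python
# Back-of-envelope check of the Jellyfin numbers above.
server_watts = 35        # total draw measured at the wall
concurrent_streams = 8   # simultaneous viewers, assumed fully concurrent

wh_per_stream_hour = server_watts / concurrent_streams
print(f"{wh_per_stream_hour:.1f} Wh per stream-hour")  # 4.4 Wh per stream-hour

netflix_claim_wh = 240   # the disputed figure for 1 h of content
print(f"claim is ~{netflix_claim_wh / wh_per_stream_hour:.0f}x higher")  # ~55x higher
```

Even rounding generously to 5 Wh, the claimed 240 Wh is roughly fifty times the per-stream cost of this hobbyist setup.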
It's quickly pointed out that he's not counting the training of models, the production of all the GPUs, the energy spent on scraping, the increased storage needs from scraping the whole internet, etc.
The Netflix number is probably not counting all the energy spent producing the shows/movies, building all the cameras/specialized equipment, building their data centers etc. either.
It is fair to compare inference to streaming. Both are done by the end user.
Does the Netflix number include the energy cost of manufacturing all the cameras/equipment used for production? Energy for travel for all the crew involved to the location? Energy for building out the sets?
The greenest AI will be connected to district heat networks instead of being cooled with air or water. It isn't even faintly green when heat is treated as a byproduct instead of a co-benefit.
> Clearly Ecosia is pushing for “people want AI” _and_ we want to make it more ecofriendly. Taking away features from users altogether is not the right answer.

> It’s like saying “cheapest car is no car”. It doesn’t solve the fundamental problem of “wanting a car”.
I'm thinking a really good search engine wouldn't make you reach for AI as often, and so could be eco-friendly that way.
The server shares resources!
> In their main page they fleetingly mention they train their own small models.
I agree, it's not much info.
> It is fair to compare inference to streaming. Both are done by the end user.
NEEEEEXT