Why this blog?
Until this moment I have been forced to listen while media and politicians alike have told me "what Canadians think". In all that time they never once asked.
This is just the voice of an ordinary Canadian yelling back at the radio -
"You don't speak for me."
email Kate
Goes to a private mailserver in Europe.
I can't answer or use every tip, but all are appreciated!
Katewerk Art
Support SDA
Paypal:
Etransfers:
katewerk(at)sasktel.net
Not a registered charity.
I cannot issue tax receipts.
Favourites/Resources
Instapundit
The Federalist
Powerline Blog
Babylon Bee
American Thinker
Legal Insurrection
Mark Steyn
American Greatness
Google Newspaper Archive
Pipeline Online
David Thompson
Podcasts
Steve Bannon's War Room
Scott Adams
Dark Horse
Michael Malice
Timcast
@Social
@Andy Ngo
@Cernovich
@Jack Posobiec
@IanMilesCheong
@AlinaChan
@YuriDeigin
@GlennGreenwald
@MattTaibbi
Support Our Advertisers
Sweetwater
Don't Run
Polar Bear Evolution
Email the Author
Wind Rain Temp
Seismic Map
What They Say About SDA
"Smalldeadanimals doesn't speak for the people of Saskatchewan" - Former Sask Premier Lorne Calvert
"I got so much traffic after your post my web host asked me to buy a larger traffic allowance." - Dr. Ross McKitrick
"Holy hell, woman. When you send someone traffic, you send someone TRAFFIC. My hosting provider thought I was being DDoSed." - Sean McCormick
"The New York Times link to me yesterday [...] generated one-fifth of the traffic I normally get from a link from Small Dead Animals." - Kathy Shaidle
"You may be a nasty right winger, but you're not nasty all the time!" - Warren Kinsella
"Go back to collecting your welfare livelihood." - Michael E. Zilkowsky
That’s enough K pop for one night
They put Kamala Harris in a computer?
Re the music rating … I wasted my afternoon listening to Paul McCartney when I could have been listening to Paul McFartney. ChatGPT would have rated it higher.
Having used commercial AI, I must say these blatherings are not that common when a large-context AI solves a closed problem with more than a negligible amount of computing power. For the price of running a V8 engine for a few minutes, you get a lot of computation, and it is incredible.
The price of AI comes partly from the fact that you need a lot of LLM runs to debug what one run of an LLM got wrong. It’s easier to make an LLM talk than to make it do a simple calculation, but believe it or not, I’m impressed with my Claude doing fact-finding on more closed problems.
If you give it a very open problem, it may be less useful. But it definitely beats me at many, many things.
Anyway, with an LLM, you had better be prepared for it to feed you something that looks right but isn’t.