April 20, 2025
From The Verge earlier this week:
During Meta’s antitrust trial today, lawyers representing Apple, Google, and Snap each expressed irritation with Meta over the slides it presented on Monday that The Verge found to contain easy-to-remove redactions. Attorneys for both Apple and Snap called the errors “egregious,” with Apple’s representative indicating that it may not be able to trust Meta with its internal information in the future. Google’s attorney also blamed Meta for jeopardizing the search giant’s data with the mistake.
Presumably, the folks from Meta used a PDF editor to draw a black vector box around the sections to be redacted. The problem with this technique is that you can open up that same PDF in an editor and delete the box to see what’s underneath.
The correct solution would have been to rasterize the PDF (which is the process of turning vectors into pixels), so any attempt to remove the redacting box would reveal an empty area. When you rasterize a page, you're essentially baking the PDF.
As John Gruber notes:
You can properly redact a PDF digitally, but botched digital redactions are so commonplace (and at times disastrous and/or humiliating) that when then Attorney General William Barr released the Mueller Report in 2019, the DOJ printed the unredacted original, did the redactions on paper, and then scanned it back in to create the redacted PDF.
But there’s an easier way: use Retrobatch of course!
The workflow would look something like this:
You read the PDF, split the pages up, add a matte background to each page which also happens to rasterize it as well, paste the pages back together with the PDF Maker node, and then write the PDF back out.
You can also control the resolution of the rasterization by using the “Set DPI” node before the page splitter.
I think there are enough specialized PDF tasks that need to be done that I should probably make a whole PDF category in Retrobatch, including a standard PDF rasterization node.
April 18, 2025
Gabriel Nicholas at Wired: The Subjective Charms of Objective-C
But the longer I spent writing Objective-C, the more I felt it hid rather than revealed. Long, sentence-like function names buried the most pertinent information under a fog of dependent clauses. Small features required long-winded pull requests, making it easy for engineers to get distracted during reviews and to miss bugs. Objective-C’s excess words, multiplied across thousands of files and millions of lines of code, made for an exhausting codebase.
My own experience with Objective-C has been very different. I wonder if that’s because I work as a solo developer, and the architecture of my apps has always been stable? I always found the early mantra “If it feels hard, you’re probably doing it wrong” when working with AppKit and Objective-C to be more true than not.
Anytime I hit a stumbling block, something like “The Way of the Code Samurai” from Wil Shipley would play through my head. Were people who disliked Objective-C fighting it rather than flowing with it?
To me, Objective-C has always felt expressive and capable, doubly so when I first started using it. After coding in Java for years I felt like I could fly.
Swift is the thing now, and both Acorn and Retrobatch use it for parts. But Swift is a heavy and unsettled language, not to mention extremely slow to compile.
I hope someday we’ll get a version of Swift that isn’t chasing whatever the hot new coding paradigm currently is, and isn’t weighed down by ever expanding complexity. I think that could be pretty nice.
Chris Lattner, the creator of Swift, in an interview:
“Swift, the original idea was factor complexity (…) massively failed, in my opinion (…) Swift has turned into a gigantic, super complicated bag of special cases, special syntax, special stuff”
I wonder, what comes after Swift?
April 17, 2025
Alex Harri: A Flowing WebGL Gradient, Deconstructed:
This effect is written in a WebGL shader using noise functions and some clever math.
In this post, I’ll break it down step by step. You need no prior knowledge of WebGL or shaders — we’ll start by building a mental model for writing shaders and then recreate the effect from scratch.
This was an absolutely wonderful read on constructing a nice looking animated WebGL shader, from the very basics up to the end product.
New to me in this post was the concept of stacking sine waves — what a clever idea.
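You can get a feel for the idea without writing a shader: sum a few sine waves with different frequencies, amplitudes, and phases, and the result already looks organic. A quick Python sketch (the constants below are arbitrary, not Harri's):

```python
import math

def stacked_sine(x, waves):
    """Sum a stack of sine waves; each wave is (amplitude, frequency, phase)."""
    return sum(a * math.sin(f * x + p) for (a, f, p) in waves)

# Three arbitrary waves. Any one of them is perfectly regular, but the
# sum already looks organic, which is the trick the gradient builds on.
WAVES = [(1.0, 1.0, 0.0), (0.5, 2.3, 1.7), (0.25, 5.1, 0.4)]
curve = [stacked_sine(x / 10.0, WAVES) for x in range(100)]
```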
You might remember Harri’s post “The Engineering behind Figma’s Vector Networks” from back in 2019.
April 14, 2025
Geoffrey Litt: Stevens: a hackable AI assistant using a single SQLite table and a handful of cron jobs
The assistant is called Stevens, named after the butler in the great Ishiguro novel Remains of the Day. Every morning it sends a brief to me and my wife via Telegram, including our calendar schedules for the day, a preview of the weather forecast, any postal mail or packages we’re expected to receive, and any reminders we’ve asked it to keep track of. All written up nice and formally, just like you’d expect from a proper butler.
SQLite, cron, and open APIs. This is the type of hacking that I really dig, and something I've considered putting together myself.
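Part of the appeal is how little code the single-table approach takes. Here's a tiny sketch of a one-table memory store (the schema is my own guess, not Litt's actual one):

```python
import sqlite3

def open_memories(path=":memory:"):
    """One table is all it takes: a date and a bit of text per memory."""
    db = sqlite3.connect(path)
    db.execute("""create table if not exists memories
                  (id integer primary key, date text, text text)""")
    return db

def remember(db, date, text):
    db.execute("insert into memories (date, text) values (?, ?)", (date, text))
    db.commit()

def brief_for(db, date):
    """Everything the morning brief would mention for a given day."""
    rows = db.execute("select text from memories where date = ?", (date,)).fetchall()
    return [r[0] for r in rows]

db = open_memories()
remember(db, "2025-04-14", "Package arriving today")
remember(db, "2025-04-14", "Dentist at 3pm")
```

Cron jobs that insert rows, plus one that reads the day's rows and sends them off, and you've got the bones of a butler.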
This passage from the end really resonated with me:
I’ve written before about how the endgame for AI-driven personal software isn’t more app silos, it’s small tools operating on a shared pool of context about our lives.
I keep on coming back to the idea that I need to gather more data about myself and store it somewhere easily accessible by custom AI tools. I write a little bit of stuff in Day One, but I keep meaning to build something on top of SQLite as well. Of course, I also keep on hoping Apple would do the same, but they'd probably move too slow to make it interesting.
If only I could clone myself. I have a ton of ideas but not enough time to implement them all. Such is the beauty of life, I guess.
April 11, 2025
With the release of Acorn 8 last December, I published "ACTN002 Acorn's Native File Format" as part of the documentation updates, which is exactly what it sounds like.
Without going into details (that's what the technote is for), Acorn's file format is a SQLite database, with a simple three-table schema, containing TIFF or PNG bitmaps to represent bitmap layers, and a plist to represent shape layers. Acorn has kept this simple format since version 2.0 back in 2009.
And since the format is a SQLite database, it is incredibly easy for a programmer or anyone else who isn't afraid of Terminal.app to get a composite out of an Acorn file:
echo "select writefile('/tmp/pizza.png', value) from image_attributes where name = 'composite'" | sqlite3 pizza.acorn
That's it. You've now got a PNG copy of the Acorn file "pizza.acorn" written to /tmp/pizza.png.
SQLite is bundled with pretty much everything these days, which means you can write some code in Python, Swift, Objective-C, whatever, and easily support reading Acorn files. Here's an incredibly short Python script to do that:
import sqlite3
import sys

conn = sqlite3.connect(sys.argv[1])
cursor = conn.cursor()
cursor.execute("select value from image_attributes where name = 'composite'")
result = cursor.fetchone()
with open(sys.argv[2], "wb") as f:
    f.write(result[0])
Note: you should really perform some error checking in actual Python code.
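If you're wondering what that error checking might look like, here's one way to wrap the same query a bit more defensively. This is my own sketch, nothing official:

```python
import sqlite3
import sys

def extract_composite(acorn_path, output_path):
    """Copy the 'composite' image blob out of an Acorn file.
    Returns True on success, False on any failure."""
    try:
        conn = sqlite3.connect(acorn_path)
        row = conn.execute(
            "select value from image_attributes where name = 'composite'"
        ).fetchone()
        conn.close()
    except sqlite3.Error as e:
        print(f"Couldn't read {acorn_path}: {e}", file=sys.stderr)
        return False
    if row is None:
        print(f"No composite found in {acorn_path}", file=sys.stderr)
        return False
    with open(output_path, "wb") as f:
        f.write(row[0])
    return True
```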
What about in Swift? That's easy too.
This file format has worked well in Acorn for 16 years now, and I plan on keeping it the same moving forward.
March 26, 2025
Last week I bought a 13" MacBook Air in Midnight (24GB memory, 512GB SSD).
I hadn't been planning on buying it. Instead, I was expecting to upgrade my current desktop (M1 Ultra) to an M4 Ultra later this year. Assuming, of course, that we would see an M4 Ultra later this year. But as we know, that didn't happen. (Maybe we still will*?)
At any rate. This machine. A 13" Midnight MacBook Air.
It's beautiful. Have you seen it yet?
I ended up buying one because it's cheap, and I haven't had a travel laptop in a while. This new laptop could also double as a new build server to replace my M1 Mac mini, and I figured someday I'll hand it down to my daughter.
The size is perfect for what I'm after. I can use my 13" iPad Pro as a second display, with the bonus that I've now got instant stylus support in Acorn because of that.
And it's so amazingly fast.
I wasn't expecting that last part. So fast.
Its name is Jimi by the way.
What I'm personally interested in is, how fast can Jimi build Acorn (~200k lines of code). Sure, the single-threaded performance of the M4 processor will certainly beat my M1 Ultra (named "SRV"), but the Ultra has so much more RAM and CPU cores. How close will the Air match the performance of my desktop?
With a full build of Acorn, including running hundreds of regression tests, Jimi outperforms my M1 Ultra at 3m21s vs 4m43s. And when purely compiling Acorn, where you'd think the Ultra would have an edge, I get 1m36s (Air) vs 2m05s (Ultra).
I'm sorry, what?
This $1400 machine is beating my $4000 desktop machine with a 20-core CPU, 48-core GPU, and 64GB of memory? What? Why? How?
So I'm pretty happy with this dinky little travel / build machine. It's a joy to hold and fun to use.
*
(As an aside, there's a lot of speculation as to what is going on with the M4 Ultra. Does it take a long time to design? Is it just not a priority? Is there something bigger and better coming for both the Studio and the Mac Pro? My guess is the last option. The Ultra is awesome, but I feel like it might be time for Apple to make a workstation-specific processor.)
March 14, 2025
There's been a lot flying around the social web the past couple of days about Apple completely botching their AI push, and I haven't seen a whole lot of proposed solutions (I fully admit I could be missing them). But off the top of my head, here's one idea that I think could really help, and reap benefits for both Apple and developers.
Build a semantic index (SI), and allow apps to access it via permissions, granted similarly to what we do for Address Book or Photos.
Maybe even make the permissions to the SI a bit more fine-grained than you normally would for other personal databases. Historical GPS locations? Scraping contents of the screen over time? Indexed contents of document folder(s)? Make these options for what goes into the SI.
And of course, the same would be true for building the SI. As a user, I'd love to be able to say "sure, capture what's on the screen and scrape the text out of that, but nope - you better not track where I've been over time".
And similar to the Spotlight indexing API, developers should be able to provide data to the SI along with rich metadata. Rev the Spotlight plugin API so that it can do more, or come up with a new API.
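To make the per-scope permissions idea a little more concrete, here's a toy model of what fine-grained grants could look like. Everything here is invented for illustration; it's not a real Apple API:

```python
from enum import Enum, auto

class SIScope(Enum):
    """The kind of fine-grained buckets a semantic index could gate on."""
    SCREEN_TEXT = auto()      # scraped contents of the screen over time
    GPS_HISTORY = auto()      # historical locations
    DOCUMENT_INDEX = auto()   # indexed contents of document folders

class SemanticIndex:
    def __init__(self):
        self.grants = {}  # app id -> set of granted scopes

    def grant(self, app_id, scope):
        self.grants.setdefault(app_id, set()).add(scope)

    def query(self, app_id, scope, text):
        if scope not in self.grants.get(app_id, set()):
            raise PermissionError(f"{app_id} has no access to {scope.name}")
        return []  # a real index would do the actual lookup here
```

The point of the sketch: "screen text yes, location history no" is a per-scope decision, made by the user, enforced by the system.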
Is this information collected for the SI going to be the most sensitive bucket of bits on your device? Yes, of course it is.
But give developers the opportunity, and then customers will have something to choose from. Make the Mac and iOS the best platform to build personalized LLMs.
Let the apps die and live based on their own merit and reputation. Apple can build the platform, and maybe expand on it over time and use it themselves.
I want to see the apps that are made outside of Cupertino. I want to see what can happen when developers have a solid foundation to build on.
March 10, 2025
As my usage of LLMs has been increasing lately, I find myself more and more frustrated with Siri, specifically on the Mac.
As a Mac user, I have this incredible wealth of GPU and CPU power, which in turn allows me to run LLMs locally.
A few weeks ago, before a trip out of the country for my daughter's spring break, I set up a local instance of DeepSeek and made sure I could connect to it via Tailscale running on my Mac.
Why did I do this? Two reasons.
The first was because I could and there's something inherently cool and fun about running these models locally. It's a joy to play around with this stuff.
The second was a tinge of paranoia. What if I wasn't able to access the models I usually use from out of the country? LLMs are so useful for so many things, I really don't want to lose access now that I know about them. Yes, I could route all requests through my VPN, but … still, what if I couldn't?
So now I can run models locally on my M1 Mac, and while it's not as fast as running them on Anthropic's or OpenAI's servers, it's still usable. Which is mind-blowing to me. I honestly never expected to see this tech in my lifetime. (Yes, LLMs get a lot wrong, but they also get so many things right, and they help me out with tedious coding chores.)
A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provided APIs to other LLM providers.
Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.
The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don't have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple's. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.
I'm glad I'm not the only one thinking about this. Ben Thompson writes at the end of Apple AI’s Platform Pivot Potential:
This doesn’t necessarily preclude finally getting new Siri to work; the opportunity Apple is pursuing continues to make sense. At the same time, the implication of the company’s differentiation shifting to hardware is that the most important job for Apple’s software is to get out of the way;
This passage isn't the crux of the article, but it really resonated with me, and I hope it does with some folks inside Apple as well.
…
(Update) Manton Reece is thinking along the same lines: Apple's response to AI:
I’m not sure Apple knows what a big risk they are taking by letting OpenAI and others lap them in the AI race. It’s a risk that will pay off if they can execute. Just as likely, though, we are seeing such a disruption in computing that Apple is vulnerable for the first time in a decade.
March 6, 2025
Acorn 8.1 is out. Full release notes are available as well.
For a .1 update, there are a bunch of new features and improvements. I think I was riding on the high of a great 8.0 release and felt compelled to keep on adding cool stuff.
As already mentioned, Acorn 8.1 includes a new scrub zoom which has been a long-standing request.
Another long-standing request included in 8.1 is the ability to resize selections using on-canvas handles, or via the palette.
Autosave has also had a revamp. There are three options now: "Off", "Native Acorn Images", and "All Images". The default is set to saving native images (.acorn).
In Acorn 8.0 (and previous versions), when autosave was enabled, non-native files (.jpeg, .png) would open without a reference to the original file on disk. This is no longer the case in Acorn 8.1, where non-native files open with a reference to the original file, and pressing ⌘S will save back to the original, regardless of the autosave setting.
Why the change? I found myself wanting autosaving of files where full fidelity would always be preserved (which is what happens when you save .acorn files), but that behavior didn't always make sense when opening a .jpeg file. JPEG files are lossy, so opening and saving the image multiple times would degrade its quality. That's not awesome. And you would also lose the editability of text and layers.
Making the autosave behavior work with multiple file types took a bit of runtime dynamics, especially since I wanted everything to work seamlessly with the macOS frameworks and their document versioning support. I eventually got there with a bit of help from Dave DeLong, which was much appreciated. I had a solution of my own, but I don't think it was nearly as good as what Dave came up with.
There are also a handful of bug fixes and other improvements that are worth looking over the release notes for.
I've also been regularly updating the documentation, and any changes of note get mentioned in the update log of the docs.
What's next? I plan on giving Retrobatch a bit of attention. It's a fun app to work on as well, and there's always common functionality that can pass back and forth with Acorn.
February 27, 2025
Charlie Monroe: A few words about indie app business:
A while ago, someone asked me for advice about starting an indie app business. So I’d given it some thought and wrote back a few points – and I thought it would be a good idea to put it out here for anyone else who’s thinking about doing this. Let’s start from the beginning.
It's a good read and any indie developer should put it on their list.
This part made me laugh:
Or just have a complete exit strategy – I eventually plan on getting a woodworking workshop and building furniture. Depends on what floats your boat.
Sounds familiar.
via DF.
January 28, 2025
A feature that's been asked for in Acorn, for probably over a decade now, is the ability to scrub zoom. This is where you can click and drag left or right with the zoom tool selected, and your image is zoomed in or out. It's a great feature and I'm happy to say that I've recently added it for the upcoming release of Acorn 8.1. And you can try it out right now if you'd like, by downloading a preview release of Acorn 8.1 from the latest builds page.
There are two ways to toggle this behavior.
Via the Zoom palette: Use the “Scrub zoom” checkbox located right above the image histogram in the zoom palette (see the image to the right).
With a keyboard shortcut: This is for my friends who like to keep their fingers on the keyboard. You can press the ‘z’ key twice in quick succession to toggle the preference on and off.
And while I was diving into Acorn's zoom code, I cleaned up a few things and increased the maximum zoom level to 25,600%, more than double the previous limit.
And here’s a quick tip: Did you know that, in addition to pressing ⌘1 to zoom your image to 100%, you can also double-click the zoom slider handle to achieve the same effect?
You can download a preview of Acorn 8.1 from the latest builds page.
January 9, 2025
Dr. Drang and Allison Sheridan have some great tips for adding Retrobatch droplets to the Finder toolbar:
Allison Sheridan:
My main use of Retrobatch is to make featured images for blog posts that match what most if not all of the social media services will recognize. That droplet thing is money. I keep mine in the toolbar of Finder though since I always have a Finder window open when I’ve made a screenshot.
And Dr. Drang on how to do this:
So I moved the Trim Screenshot app [Retrobatch droplet] into the Applications folder and set about adding it to my Finder toolbar. I control-clicked on the Finder window’s toolbar, selected Customize Toolbar…, and dragged the Trim Screenshot icon into the toolbar, putting it to the right of the folder name. Since I keep a post’s screenshots in the same folder as the Markdown source, I always have that folder open when writing. It’s now a simple matter to drag the screenshot from the Finder window up into the toolbar and drop it on the Trim Screenshot icon.
I've been doing something similar when updating Acorn's documentation recently. I have a subfolder in my ~/Applications folder named RBDroplets, and in there I put a Retrobatch droplet, which is then added to my Finder toolbar. This particular droplet reads in an Acorn image and then spits out JPEG-XL, WebP, and JPEG versions of that file in the same directory the Acorn image is in. Then in my documentation I have a little scriptlet that looks like this:
<% pic(writer, 'preferences_images/prefs_registration', ['webp', 'jxl', 'jpeg'], width="800", alt="Registration Settings"); %>
And it creates a picture HTML tag with the correct images, using the last element in the type array as the default.
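If you're curious, here's roughly what that expands to, sketched in Python. This is my reconstruction of the idea, not the actual documentation tooling:

```python
def picture_tag(base, types, width, alt):
    """Emit a <picture> tag with one <source> per format, falling back to
    the last entry in the type array for the plain <img>."""
    sources = "\n".join(
        f'  <source srcset="{base}.{t}" type="image/{t}">' for t in types[:-1]
    )
    img = f'  <img src="{base}.{types[-1]}" width="{width}" alt="{alt}">'
    return f"<picture>\n{sources}\n{img}\n</picture>"

html = picture_tag("preferences_images/prefs_registration",
                   ["webp", "jxl", "jpeg"], "800", "Registration Settings")
```

Browsers walk the source list in order and take the first format they support, so the broadly supported JPEG goes last as the img fallback.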
Droplets are fun, but it's annoying if you save them to the Desktop or Downloads folder: macOS flips out and throws up a warning if the contents of the droplet change in any way, because it's no longer a "trusted" app.
(Related: if anyone knows how to turn this warning off per app / bundle id, I'd love to know how).
January 6, 2025
I've got two app updates for you today.
Acorn 8.0.1 is out, which is of course about fixing bugs that made it past the final release. It always happens, but this time around it was pretty mild. Release notes are available.
Retrobatch 2.2.1 is also out, with some RAW image performance improvements and a couple of new features. Release notes for Retrobatch 2.2.1 are also available.
January 4, 2025
MinnMax: Panic, Playdate, Infinite Independence (YouTube link).
A video mostly about everyone's favorite indie Mac company, Panic. It came out a few months ago, but I somehow completely missed it. Cabel Sasser gives a tour of Panic's new offices, shows off their McDonald’s-inspired kitchen, and shows some interesting bits from their Panic History Archive room.
December 16, 2024
Acorn 8 has been released!
This is a major update of Acorn, and is currently on a time-limited sale for $19.99. It's still a one-time purchase to use as long as you'd like, and as usual, the full release notes are available. I want to highlight some of my favorite things below.
"Select Subject", "Mask Subject", and "Remove Background" are new commands which use machine learning (or A.I. if you prefer) to find the most important parts of your image, and then perform their respective operations. This has been a request for a long time, and while I was doubtful of its utility, it's actually pretty fun to play with and more useful than I figured it would be. So I'm glad I took the time to integrate it.
You can now set your measurement units to inches, centimeters, or pixels, and the setting shows up across the tools for your image, not just specific ones. This includes the crop palette, shape dimensions, filter settings… well, pretty much everything. This might be the oldest feature request I've implemented so far. And related to this, Acorn 8 now has an on-canvas ruler which you can use to measure out distances, straighten your image, or even redefine the DPI.
Lookup Table (LUT) support. LUTs are pretty fun, and they work by mapping one set of colors to another, enabling consistent or stylized visual effects. LUTs are used primarily in photography and filmmaking, and you can download and install new LUTs from various places across the internet.
Acorn 8 has the ability to read in a CSV file and it'll dynamically swap in the row values and replace text or bitmap graphics depending on what's in the data file. It's like mail merge, but for images. This is pretty awesome if you have a bunch of templated images you want to create.
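Conceptually it's a simple templating pass: read a row, substitute the values, emit one result per row. A stripped-down sketch in Python, with an invented placeholder syntax (Acorn swaps in bitmaps too, but strings show the shape of it):

```python
import csv
import io

def merge_rows(template, csv_text):
    """Yield one rendered result per CSV row, swapping {column}
    placeholders for that row's values."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        yield template.format(**row)

DATA = "name,age\nAda,36\nGrace,52\n"
cards = list(merge_rows("Happy birthday, {name}! You are {age}.", DATA))
```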
Acorn has a new "Quick Processor". It's a quick version of the Shape Processor, where you can duplicate shapes, rotate, transform, and apply other operations to them. You can even use snippets of JavaScript to perform your own magic to shapes, including modifying anchors in bezier shapes. I hope to build a little library of cool JavaScript shape filters for this in the future.
OK, now for some geeky implementation things.
OpenGL has been completely removed and Acorn is 100% Metal. This was nice to do, and I was waiting to drop support for older versions of macOS first, but I'm glad it's finally happened.
Acorn's Shortcuts support has been completely rewritten (in Swift) to use the new App Intents framework. Hopefully this puts Acorn in a good place for the future as Apple adds more Siri integration to apps. Maybe someday you'll be able to say "Open up these selected images in Acorn, crop them to 4x3, convert to PNG, save and close them". That's the dream anyway.
Acorn's internal Bézier implementation has been reworked. This was one of the first things I did, as there was a mismatch between the internal API that Acorn used and what it presented on screen. In the classic implementation of Bézier curves you have a start point, two control points (cp1 & cp2), and an end point. But that's not how anybody actually works with them when they're presented on screen. On screen you generally manipulate a single anchor, which can represent both a start and an end point, along with the previous curve's cp2 and the next curve's cp1. Not to mention all the nuances where a shape can have multiple continuous paths, or where a path is closed or still open. So I wrote a fun shim on top of the Bézier "data" which became the new interface for Acorn's canvas to manipulate anchors and such. I backed it all up with tests, and this new model ended up fixing some bugs and making the implementation cleaner. It's always a good feeling when you can get something done like that, even if it doesn't really change anything that the customer can see.
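To make the mismatch concrete, here's a little anchor-centric sketch that converts on-screen anchors back into classic (start, cp1, cp2, end) segments. This is my own illustration of the general idea, not Acorn's actual interface:

```python
from dataclasses import dataclass

@dataclass
class Anchor:
    """One on-screen anchor: a point plus the handles on either side.
    control_in is the previous segment's cp2; control_out is the next
    segment's cp1."""
    point: tuple
    control_in: tuple = None
    control_out: tuple = None

def segments(anchors, closed=False):
    """Convert a run of anchors back into classic (start, cp1, cp2, end)
    Bezier segments. A missing handle collapses to the anchor's point."""
    pairs = list(zip(anchors, anchors[1:]))
    if closed and len(anchors) > 1:
        pairs.append((anchors[-1], anchors[0]))  # wrap around for closed paths
    return [
        (a.point, a.control_out or a.point, b.control_in or b.point, b.point)
        for (a, b) in pairs
    ]

path = [Anchor((0, 0), control_out=(1, 2)),
        Anchor((4, 0), control_in=(3, 2))]
```

The canvas gets to think in anchors, while the rendering side still gets its four-point segments.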
And finally, Acorn's documentation has been converted from custom RTF files (and an app named "FMWrite") into Markdown, which hopefully opens up the door for more integration with templates and services. I certainly have been enjoying seeing diffs between revision commits.
Of course there's more. There's always more. Make sure to tell a friend, and even check out the full release notes.
November 20, 2024
Axios: Apple is selling Apple News ads directly for the first time.
Apple has started selling its own advertising inventory for Apple News, two sources familiar with the effort told Axios. It's pitching new ad units that it hopes will maximize revenue for itself and its publishing partners.
I've been meaning to write about this for a while now, and since this popped up on Axios, now is a good time as ever. So here are my very brief thoughts.
I really think Apple should get out of the ads business, starting with the App Store. I find it corrupting, ugly, distasteful, and most of all an anti-premium experience.
Apple has always tried to project a premium experience, and seeing ads in the App Store just smells of desperation to me. I don't understand why they would want or need them. Apple already takes a commission from apps, and seemingly makes a good amount of money doing so. Having dumb ads feels petty and continues to breed a feeling that Apple will try to make any amount of money any time it can. Can't they just sit back and make something nice without squeezing every dime out of it?
And I, along with many other people, associate ads with tracking. Why mix that in at all when the App Store already has a reputation for harboring scammy apps?
I wish they'd just stop. It's damaging to their reputation and I don't think having ads is worth whatever revenue they are generating from it.
Via Daring Fireball.
November 11, 2024
I've just pushed up Acorn 7.4.6, which is all about bug fixes, specifically Quick Look fixes for macOS 15 Sequoia (which were kind of tricky to get right). If you're on Sequoia, make sure to grab this version.
I've also pushed up Retrobatch 2.2, which has a new CSV Reader node. It's basically like mail merge, but for images. Need to make 150 custom invitations as JPEGs for your Grandma's 100th birthday? This is a good way to do it.
Also included in RB 2.2 is a new node called "Dynamic Image Overlay" which makes it possible to insert unique (or randomized) images for each image processed in Retrobatch. When used in conjunction with the new CSV node, you can add custom images to your birthday invitations. Full release notes for Retrobatch 2.2 are available.
And finally, Gus Mueller 50.0 was released today. Besides pushing out software updates, this updated build will be heading to the climbing gym to ascend 50 routes in a day, about 2,500ft of vertical total. We'll see if he survives or not. Version 50.00273 might regret it.
Update:
51 climbs were completed. It was a mix of lead, top rope, and auto-belay. I really wanted to throw in some hard ones, but after about 20 climbs… well the smaller holds on even the 11s were getting tough to hold on to, even though I could pull overhanging jugs just fine. So only a single 12a was done, but it was my last climb and one of the hardest.
Also a big hug and thank you to Caitlin who took time off work to belay me today. It would have been 50 super boring auto belays without you.
Stats!
5.7: 6 top rope
5.8: 6 top rope, 1 lead
5.9: 11 top rope, 1 lead
10a: 4 lead
10b: 3 top rope, 2 lead
10c: 3 top rope, 4 lead
10d: 1 top rope, 1 lead
11a: 5 top rope, 2 lead
12a: 1 top rope
My watch says I burned over 1.8k calories, in two sessions, in about a total of 4 hours. I also ate a whole Detroit style pizza and a burrito. It was a good birthday, and I didn't fall on a single route.
November 6, 2024
Ken White (aka, Popehat): And Yet It Moves:
Trump won yesterday, as I feared he would. I firmly believe America — and likely the world — will get significantly worse for at least a generation, probably more. I’ll spare you, for now, the why. Frankly, I think you either already accept it or will never accept it. The things I care about, like the rule of law and equality before it, freedom of religion, freedom of speech, free trade in service of free people, relative prosperity, protection of the weak from the strong, truth, and human dignity are all going to suffer. Bullies and their sycophants and apologists will thrive.
What should we do?
I have a few thoughts.
November 4, 2024
So I didn't see that one coming.
John Gruber has a good take over at Daring Fireball: Pixelmator Acquired by Apple; Future of Their Apps Uncertain.
Acorn and Pixelmator came out 15 days apart from each other in 2007, and the target market between the two has always overlapped. But even with that I've always been on good terms with the Pixelmator folks. Any time we were both attending WWDC, we would meet up and complain about image APIs or just chat over lunch.
The other major player in this category is Affinity, which was purchased by Canva in March of this year. So it feels strange that Acorn is now effectively the only independent Mac image editor from that era.
I have no inside information on what Apple is going to do with Pixelmator. Will it be discontinued? Will it be part of an Apple One subscription tier? Will it be part of the OS now or folded into Photos? Was this purely a talent grab?
Time will tell.
But today I woke up, and got to work on Acorn. And I'll do the same tomorrow and the day after that.
I enjoy what I work on and I plan on doing it for many years to come. And I truly value my independence. I love being able to work on what I want, and when I want to.
Good things are happening to Acorn these days. I'm wrapping up some great new features in a new release, and if you'd like to test them out, let me know.
My iPhone Battery Life After a Year at 80% Charge Limit
Juli Clover at MacRumors:
With the iPhone 15 models that came out last year, Apple added an opt-in battery setting that limits maximum charge to 80 percent. The idea is that never charging the iPhone above 80 percent will increase battery longevity, so I kept my iPhone at that 80 percent limit from September 2023 to now, with no cheating.
…
My iPhone 15 Pro Max battery level is currently at 94 percent with 299 cycles
Via John Gruber, whose stats are "max capacity: 89 percent, 344 charge cycles".
I kept my phone at the 80% charge limit for most of the year. There were 4-5 times where I let it charge to 100% for things like camping trips or long climbing days. I never carry an extra battery (with the exception of camping where there was no power), and I don't think I once depleted my phone. With a handful of exceptions, I charge via MagSafe.
My day one iPhone 15 Pro stats are: max capacity: 100%, cycle count: 229.
Am I an outlier, or are Gruber and Clover?