Shutter Clicks: Snapshots of my interests
Landon Carter (me@lycarter.com)

Rebuilding on AWS (2024-09-11)
https://www.lycarter.com/2024-09-11/rebuilding-on-aws

Previously, I had this blog hosted on Netlify, which was great because (1) it was free, and (2) it offered Git LFS storage, which was great for storing all of the images, PDFs, etc., that this site has. Since this is a statically generated site, I don’t have a typical CMS, and Jekyll doesn’t really have any first-class solutions for hosting media other than Git LFS. Unfortunately, Netlify decided to deprecate its Git LFS implementation, recommending that users use the Git LFS implementation that comes with their code host instead.

Unfortunately, Github’s Git LFS offering is…trash. You get only 1 GiB of free storage and 1 GiB of free bandwidth per month. This site is hardly anything, and it’s already 2 GiB - and even the data packs are absurdly priced: $5/mo for 50 GiB of storage and 50 GiB of bandwidth. AWS offers 5 GiB free in S3, and past that it’s about 2¢/GiB ($1 per 50 GiB) per month, with free data transfer in and negligible data transfer costs out (as long as the website is served through CloudFront rather than directly from S3).

So, not looking forward to the task of migrating all of my website infrastructure, I decided to go basically all-in on AWS. If AWS ever breaks its APIs or decides to deprecate things, there will at least be plenty of articles about how to migrate.

Current Setup

Hosting

I mostly followed this article about how to set up Jekyll hosting with Github Actions pushing to S3 and CloudFront - there’s nothing too interesting there overall; I basically just followed the instructions, with a few tweaks:

  1. The “Static website hosting” section is no longer part of the S3 bucket setup; it’s all the way at the bottom of the bucket’s Properties page.

  2. The origin of the CloudFront distribution needs to be manually tweaked. You can’t just use the origin from the S3 bucket picker; you need to manually set the URL to the bucket’s static website hosting endpoint, otherwise you’ll wind up with a nasty XML error.

  3. My domain was managed through Google Domains, but is now managed through Squarespace. In order to get the SSL certificate set up correctly, I needed to add a DNS validation record - note the precise position of underscores and dots (see the sketch after this list).
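For illustration, an ACM certificate-validation record has roughly this shape - the names and values below are made-up placeholders; the real ones come from the AWS Certificate Manager console:

    ; hypothetical ACM DNS-validation CNAME - values are placeholders
    _3f9a8b7c.www.lycarter.com.  CNAME  _1a2b3c4d.abcdefgh.acm-validations.aws.

Drop a leading underscore or a trailing dot and the certificate just sits in “Pending validation” forever.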

Git LFS

What was a lot more interesting was setting up the Git LFS proxy so that I could back Git LFS files with S3. I don’t understand why there’s not an easy solution here - it feels like it would be a pretty common ask to have a cheaper setup, but apparently it’s quite uncommon? I guess a lot of folks just shift to using something else like Dropbox or an external database? I basically blame Git LFS overall. The more I read about Git LFS over the course of the project, the more I’m convinced that it’s a terrible hack that no one should ever use. Yet! We press on, because I only had one weekend of time to rebuild my website, and I didn’t feel like figuring out a totally different media solution.

It wasn’t too hard to find some people setting up S3 as a Git LFS target, but finding anything that did so in any remotely modern way was…challenging. All of the repos I found had some problem - they were archived years ago, or required you to run the proxy elsewhere, or a different elsewhere, or didn’t have any published releases or good install instructions. But…since it’s built on pretty stable tech, it’s not like anything really breaks too badly, right? I eventually settled on node-git-lfs as my localhost proxy, since I found this great article describing how to get it set up. It more or less worked out of the box, but it was giving me a bunch of scary warnings about AWS’s js SDK v2 being deprecated literally tomorrow, with support ending in September 2025. Since I was already in the process of dealing with upstream service deprecations, I went on a little futureproofing side quest to hack-and-slash my way through an upgrade to the current v3 SDK (spoiler: the codemod command that’s included didn’t work at all). Since I am not a javascript dev, I wound up speeding up the process tremendously by getting Codeium to help out - it’s basically a Github Copilot alternative, and it worked extremely well when dealing with such a common language, small context window, and frequently-used SDKs - basically the perfect use case for LLMs as code assistants at the moment.
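To give a flavor of the mechanical-but-pervasive changes involved, here’s the general shape of the v2-to-v3 migration (this is the standard pattern, not an actual diff from node-git-lfs; the bucket and key are placeholders):

    // AWS js SDK v2: monolithic import, .promise() style
    const AWS = require('aws-sdk');
    const s3v2 = new AWS.S3();
    const objV2 = await s3v2.getObject({ Bucket: 'my-lfs-bucket', Key: 'abc123' }).promise();

    // AWS js SDK v3: modular client packages, command objects, send()
    const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
    const s3v3 = new S3Client({});
    const objV3 = await s3v3.send(new GetObjectCommand({ Bucket: 'my-lfs-bucket', Key: 'abc123' }));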

If you’d like to use a butchered-but-upgraded version, it’s on my github. I didn’t bother setting up builds (looks like the previous author used Travis CI), and I didn’t bother getting anything to run except for the one little config I personally use.

Gotchas

After that, there were a few details to iron out:

S3 requires exact filename access

My site had always been served at URLs like https://lycarter.com/date/article, but the literal compiled files in the Jekyll _site folder are at /date/article.html, so I wrote a little script to rename all of the .html files to extensionless files. Unfortunately, that meant S3 no longer knew what MIME type to serve the files with, and Chrome would download them instead of displaying them. To solve that, I added another few lines to my deploy script to apply content-type: "text/html" to all the files except those in the assets folder. A rough sketch of both steps is below.
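Something like this, assuming the compiled site lives in _site and $BUCKET names the site bucket (my actual deploy script differs in the details):

    # Rename compiled .html files to extensionless names (keep index.html)
    find _site -name '*.html' ! -name 'index.html' | while read -r f; do
      mv "$f" "${f%.html}"
    done

    # Upload, then re-apply the HTML content type everywhere outside assets/
    aws s3 sync _site "s3://$BUCKET" --delete
    aws s3 cp "s3://$BUCKET" "s3://$BUCKET" --recursive \
      --exclude "assets/*" --content-type "text/html" \
      --metadata-directive REPLACE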

Running the localhost proxy in Github Actions is…weird

There may very well be a better way to do this, but I had trouble figuring out a convenient way to actually run the localhost proxy so that Github Actions could download the Git LFS files. Eventually, I wound up just starting the proxy in the background, sleeping 25 seconds to wait for the server to start up (testing showed it usually took about 18 seconds), and then issuing the git lfs pull. It’s generally bad practice to hardcode timeouts like that for control flow, but it took me 5 minutes to do and it seems to work, so it’s Good Enough For Now™. The step boils down to the sketch below.
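Roughly this (the proxy’s entrypoint path is a placeholder for wherever your checkout puts it):

    # Hypothetical Github Actions run step for fetching LFS objects
    node ./node-git-lfs/bin/server.js &   # start the proxy in the background
    sleep 25                              # hardcoded wait; startup takes ~18s
    git lfs pull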

Improvements to be made

On the subject of improvements like removing the 25-second sleep from the deploy script: I noticed that running the deploy a bunch of times during testing pretty quickly chewed through my S3 free-tier limit of 20k requests. I believe there are a few things contributing to that, and a few things I could fix:

  • The deploy script downloads all ~800 files from S3 every time. There’s a few possible ways to address that:
    • If I only put the fullsize versions of the images in Git LFS and generated the thumbnails at build time, I could cut requests by a factor of 3.
    • If I made my Git LFS proxy zip them up before storing in S3, it’d be only 1 request per deploy (at the expense of being much harder to inspect / debug).
    • If I had the deploy step skip the images entirely, directly moving them from one S3 bucket to another, I could save a ton of time on the deploy step, and save some network calls.
  • The deploy script uploads too many things - because I fix the content-type after upload with an aws s3 cp, I incur extra requests. I could write a better sync script that uploads each of the extensionless files with the correct content-type set at upload time (see the sketch after this list), and even better, could check whether the live-deployed site bucket has current versions of the files before uploading them.
  • I could probably put the LFS bucket behind CloudFront as well, so that Github Actions hits CloudFront instead of hitting S3 directly - most of the files don’t change, so it would probably work?
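For the upload-time content-type idea from the second bullet, a single-pass version might look like this (again assuming _site and $BUCKET; untested):

    # Two targeted syncs instead of sync-everything + recursive cp:
    # assets keep their inferred types; pages get text/html at upload time
    aws s3 sync _site/assets "s3://$BUCKET/assets" --delete
    aws s3 sync _site "s3://$BUCKET" --exclude "assets/*" \
      --content-type "text/html" --delete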

Ultimately though, I’m pretty happy with how things are now. The site seems stable, the development cost and the AWS cost per post is essentially zero, and AWS isn’t going anywhere.

Thanks for reading :)

The Drum Scanner Adventure (2023-07-11)
https://www.lycarter.com/2023-07-11/drum-scanner-adventure

Introduction

I introduced the last post with “Copper pans are the Rolexes of cookware (silver pans are the Richard Milles)”. Drum scanners, then, are the Richard Milles of film scanners, and although I got one of the lowest-end models ever made, I think it’s safe to say that for large format scanning, it’s superior to literally every non-drum-scanner option.

But first: what the heck is a drum scanner?

Film scanners are used to digitize film. They have a method to hold the film, a light source which shines through the film, and a sensor of some sort to detect the light passing through the film. There are generally about 5 categories of film scanners:

DSLR scanning

The film is placed on a light table of some sort, and you take a picture of it. Most setups will use a macro lens to achieve a higher magnification and a stand to keep everything nice and coplanar. This can be quite fiddly to set up, but the good news is that it’s very fast for single-shot images. This doesn’t scale very well to large format film, though - you either give up resolution by taking a single photo, or you have to take many photos and stitch them together later. This is popular since it doesn’t tend to require much additional equipment to get started.

Flatbed scanning

When I say the word “scanner”, this is the type of thing most people picture. Flatbed film scanners differ from flatbed document scanners only in that they have a light that shines from the top, through the film, rather than shining from next to the sensor to reflect off the document. Scanners capable of 35mm and medium format can be had for pretty cheap ($200 or so), but scanners which support large format start around $800.

Dedicated film scanners

There exist dedicated film scanners for 35mm and medium format, which frequently can take an entire roll at once. These work well for bulk scanning, and frequently have higher resolution and performance than flatbed scanners. They tend to be quite expensive, especially some of the most popular older models. I believe these tend to operate similarly to shrunken-down flatbed scanners, although Nikon advertises that these ship with ED glass, which suggests there may be a lens between the film and the linear CCD, possibly improving sharpness slightly.

Hasselblad Flextight / Imacon scanners

This is what you’ll find if you go to B&H and sort by price descending (always a fun activity). Edit: wow, it seems these were discontinued since the last time I did this search a few years ago - now you find a Blackmagic 4K video film scanner instead. Anyway, these were advertised as “virtual drum scanners” and retailed for about $25,000. As far as I can tell, there are two key features that make these scanners special. First, they have an actual (really high quality) lens between the film and the sensor, rather than using a linear CCD directly the way the flatbed scanners do. Second, rather than scanning film while it’s flat, they intentionally curve the film, and as it rotates through the curve, scan only the line which is closest. Since film is relatively stiff, bending it intentionally in one direction pretty much ensures that it won’t flex inadvertently in the orthogonal direction (e.g., the film forms a cylinder, not a Pringle). One of the primary challenges of each of the methods described above is keeping the film flat - it likes to curl up, which means it tends to go out of focus if you’re trying to image it with a flat plane of focus. Flatbed scanners and DSLR scanners can make use of wet mounting to seal the film flat against a reference surface (generally, a piece of glass), but you usually have to get a dedicated film carrier for wet mounting, and achieving accurate focus can still be a bit fiddly.

Drum Scanners

Finally, we get to drum scanners. Drum scanners have always been the highest-end, highest-performance scanning option, with modern options costing significantly more than the Hasselblad Flextights. Frankly, I had a difficult time finding any actual sales listings - I think I remember seeing a 1995 listing for $40,000 or so, and Ken Rockwell claims that Heidelberg drum scanners retailed for $100,000+. More typically, a photo house would own a drum scanner and offer to scan your images on a per-image basis, with prices ranging from $35 to $500 per image scanned. Imagine paying the same price for a single scan that you would pay for an entire entry-level DSLR. What makes them so magical?

Drum scanners operate by having an acrylic tube (the drum) that the film is typically wet-mounted to. As you’ll recall from above, putting film on a cylinder mostly removes the possibility for warping that leads to out-of-focus scans. Wet mounting additionally helps to reduce the effect of scratches and dust by filling in all of the potential gaps that would adversely affect the scan, as well as to eliminate Newton rings (remember those from high school science class: put two flat, transparent objects almost in perfect contact, and rainbow interference patterns form; by bridging the gap with a material of similar enough refractive index, wet mounting can basically eliminate this effect).

Once the film is mounted on the drum and the drum is loaded into the machine, light is shone from inside to outside via a halogen lamp redirected through a light guide. On the outside of the drum, rather than having a linear CCD or area sensor the way all of the other methods do, drum scanners have a series of dichroic mirrors and photomultiplier tubes (PMTs) - this setup can sample individual points from the film with extremely high sensitivity. The drum spins to sample radially, and the head scans across to sample axially. Basically, the film is scanned in a very tight spiral, sampling individual points one at a time (the size of the point is determined mostly by the aperture size, which ranges from 5 μm to 512 μm on my very low-end model). This is incredibly different from all of the other scanning methods, because PMTs can have a much higher sensitivity and higher dynamic range for better tonal reproduction (particularly useful in the shadows of slide film, or the highlights of negative film, where the amount of transmitted light is particularly low).
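As a quick sanity check of my own (not from any scanner documentation), the sample pitch at a given scanning resolution is just an inch divided by the dpi:

    \text{pitch at 4000 dpi} = \frac{25400\,\mu\text{m}}{4000} = 6.35\,\mu\text{m}

which lines up nicely with the 5 μm minimum aperture - the smallest aperture is just small enough to resolve the finest sampling pitch the scanner offers.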

The Adventure

So now, we’ve established drum scanners are the absolute peak of scanning technology. Even the lowest-end drum scanners from the early ’90s have better resolution than flatbed scanners for large format, comparable resolution for medium format, and only marginally worse resolution for 35mm. They also have better tonal reproduction than any flatbed scanner ever made.

My adventure begins with a harmless eBay search for “drum scanner, price <$2500”. I don’t even bother with a saved search for this, because there are usually fewer than 5 drum scanners listed on eBay at a time, and they’re almost always $5000+ with freight shipping or local pickup required. Well, a few months ago I happened to spot a listing for $1000, local pickup. I skimmed over it, because I still wasn’t really interested in paying $1000 for a film scanner of any variety, and it wasn’t in NYC anyway. However, I happened to see the exact same listing on Craigslist, and on Craigslist it was listed “for trade”. “For trade” is always intriguing, because I have a lot of camera-related cruft to trade. Furthermore, I realized that the “local pickup” was in Connecticut - not too far from NYC. I got in contact with the seller, and after a bit of haggling, settled on a trade deal: a working Howtek D4000, a broken Howtek D4000, two good drums, two scratched drums, and the computer setup to run the scanners, traded for a Meade 8” telescope, a Mavic Pro, and my old D300s with a couple of lenses.

This was a particularly good trade, because the increase in volume of camera-related cruft I own would be offset a bit by getting rid of the telescope - I realized after purchasing the telescope that no, even with the best intentions, NYC skies are too light-polluted to see anything but the moon and occasionally the planets. Since acquiring the telescope, I had also acquired a 400mm f/2.8, which, when coupled with a 2x teleconverter, made a very capable telescope, with higher resolution (at lower magnification) than the 8” Meade scope.

With the trade offer secured, I convinced one of my friends to rent a car and drive up to Connecticut one Saturday in early June, with a couple of additional stops planned to go strawberry picking, eat some donuts in New Haven, and stop by Costco and Daiso on the way back.

That concluded the day’s adventures, but now I had two giant scanners, two G4 Macs, two monitors, and a large handful of cables, drums, manuals, and other accessories. My room/workshop was basically unusable until I built a storage solution, so I designed a stand that my shelves could go on top of, then glued some casters to the scanners so that I could wheel them in and out from under the stand. That worked incredibly well, they take up a minimum of critical space and can be easily wheeled around for use.

Powering On

After getting everything neatly tucked away, I took a break from the scanners for a few weeks to work on my workbench (it’s getting so close!), but after having to pause work on that to wait for some more supplies to arrive, a few weekends ago I decided it was finally time to turn the scanners on. I knew this wouldn’t be a perfectly smooth endeavor, because the seller had mentioned that the original G4 Mac he’d been using had died, so he kindly bought a new one. Unfortunately, the new one didn’t have a copy of the scanning software, and transferring the license from the dead Mac to the living Mac, as well as installing the software on the living Mac in the first place, sounded like a tricky task…

I spent quite a while fiddling with it, and was stymied by a few different dead ends:

  • When I tried swapping the power supply from the working G4 Mac to the broken G4 Mac, I realized that the broken G4 Mac was one of the only models to use a nonstandard power supply: I couldn’t swap in the power supply from the working G4, and I couldn’t just drop in a modern ATX power supply. There’s a really helpful site that details how I could make an adapter from my ATX power supply to the broken Mac, but it would have taken a couple of hours at least. I could have also purchased the adapter pre-made for about $50 (with a $30 rebate if I sent in the broken power supply to be refurbished and resold), but I was intent on getting as far as I could within just the one weekend, and waiting for shipping would have stalled the project by a week. Nothing kills momentum more than not having the right parts on hand.
  • I also tried turning on the working G4, and even managed to get it onto the network and attached to my NAS! I bet that old machine has never seen a drive with multiple terabytes of free space, considering it comes from an era when drives were measured in gigabytes…unfortunately, I also realized that this Mac was running OS 9.2, not even OSX. The oldest software download I could find for Silverfast, the scanning software, targeted OSX 10.3, and the differences between OS 9.2 and OSX 10.3 were large enough that there was no hope of running the software on the older Mac. I could have tried to upgrade/reinstall with OSX 10.3, but didn’t really fancy my chances of getting that to work without burning a DVD, which I don’t have the capability for at the moment.
  • I explored the option of fitting the SCSI card into my modern PC and running the Windows version of the scanning software - it supported up through XP, and I figured it would probably work, but the SCSI card is PCI, not PCIE. Getting a PCIE SCSI card is a very expensive proposition, and again went against my goal of getting everything working within one weekend.

Finally, the option that wound up working, after a bit of fiddling, was to swap the PCI SCSI card and hard drive from the broken Mac into the working Mac. I couldn’t get it to work with just the new hard drive, and had to switch the PATA jumper on the existing hard drive over to “CS” (cable select) mode in order to boot from the new hard drive. Figuring all of that out took at least an hour, but after a lot of fiddling, I was finally greeted with an OSX 10.3 machine with Silverfast installed and a working license! Progress!

Now that I had the computer launched, the next challenge was to sort out the scanner <> computer connection. The D4000 connects via SCSI, and luckily a cable, PCI card, and SCSI terminator were all included by the seller. It took some more fiddling and a bunch of turning things off and on again to get the scanner and the computer talking to each other. What wound up working was to set the scanner IO to SCSI(4), and make sure that the scanner turned on before the Mac. After that, they connected!

I don’t have any of the wet mounting supplies yet, so for the first test I mounted an old darkroom print I’d made and scanned it in reflective mode. Without wet mounting, I couldn’t get it super flat, but it was still a good test. I quickly learned not to cover up the calibration zone of the drum, as my first few scans had a lot of streaks from mis-calibration due to my tape extending into the calibration zone. After that, I kicked off a 1000 dpi scan, and off it went! The scan took a little while, but the result was incredible. I’ve always loved this particular photo - easily my favorite 35mm film photo.

Next Steps

I have a few modifications and upgrades I’d like to do. Firstly, I need a good way to actually mount the film for scanning - the trade came with a drum mounting station, but the rubber roller is completely falling apart, and would totally ruin the drum or film if I tried to use it. I need to figure out a way to replace that - I’ll probably try to DIY something with an industrial roller, but I did get in contact with a custom roller manufacturer to see if they can make something for me for cheap enough - they quoted $380 for qty 1, which is a bit steeper than I’d hoped.

I also needed to acquire some drum-cleaning fluid, wet-mounting fluid, archival tape, and optical mylar. Aztek supplies all of those, but the complete package is $230, which is also a bit steep for what it actually is. If you look up the MSDS for the drum-cleaning fluid, you’ll see that it’s over 90% naphtha and <4% n-hexane, so I bought a gallon can of naphtha from Home Depot for $12.50. The MSDS for the scanner mounting fluid lists <10% mineral spirits, <90% naphtha, and <4% n-hexane, so I’m going to try just using naphtha, and buy some mineral spirits if it doesn’t work out. For the mylar, I got 3 mil Dura-Lar from Amazon for $11.50. I spent just a little while looking at tape alternatives, but the Kami tape is only $20/roll, so it’s not too insane.

I’m also interested in exploring some adapters that’ll let me connect the scanner to my modern computer rather than the old G4 Mac, so I can cut down on the extra bulky cruft. There’s a USB-to-SCSI adapter for $150, or a PCIE-to-PCI adapter for about $40, both of which are much cheaper than the actual PCIE SCSI cards, so I’ll probably start by trying those out.

Once I work out the film mounting situation a little better, I’ll go through my backlog of large format scans, and probably re-scan some of my better medium format photos as well. Once I have the process nailed down, I might try to start a side business offering drum scans for mail-in film or photographers in the NYC area. I know I have some friends that are interested - if you’d be interested in getting your film drum scanned, get in touch! My D4000 is only capable of 4000 dpi, but even at that it’s still better than any flatbed scanner’s true optical resolution. I’m already blown away by a 286 MP scan I made of an old 8x10 Velvia 50 photograph which wasn’t even properly mounted! I’m sure it’ll get even better once I have proper mounting supplies.

Thanks to my friend for driving us on the Connecticut adventure, thanks to the guy I traded with who was willing to strike an interesting trade deal, and thank you for reading!

Copper Pan Restoration via Electroplating (2023-03-07)
https://www.lycarter.com/2023-03-07/copper-pan-electroplating

Introduction

Copper pans are the Rolexes of cookware (silver pans are the Richard Milles) - they’re beautiful and immensely functional. There are a few features that you want in an ideal piece of cookware:

  • High thermal conductivity to provide even heating
  • High thermal mass to prevent temperature fluctuations

Copper has a thermal conductivity roughly 2-3x that of aluminum, and about 25x that of stainless steel (that’s why the best stainless steel pans are clad around an aluminum core, for better heat distribution). Copper also has a volumetric heat capacity about 1.5x that of aluminum and only a bit lower than stainless steel or cast iron, which means that it holds heat about as well as an aluminum pan 1.5x as thick. Overall, this article does a good job of summarizing some of the pros and cons. Silver is the only superior material across the board, but is about 20x as expensive as copper, which itself is already almost 10x the cost of aluminum.

So, what is an engineer chef to do? I want the best (practical) performance, but new copper pans run about $300-600/pan. Additionally, thicker is clearly better for the thermal mass part of the equation, and modern copper cookware doesn’t tend to be as thick as some vintage pieces. This article from Vintage French Copper, a lovely little blog about collecting copper cookware, does a good job of summarizing the impact of thickness on collectability and usability. To that end, the answer I came up with was to purchase some vintage copper cookware online. I got a few pieces - a set of beautiful, thick saucepans from 12cm to 20cm, a small oval fish pan, a large shallow-walled frypan, a medium-sized Windsor, and a medium-sized saute pan. The saucepans were all nickel-lined, while the rest were all tin-lined - of those, the Windsor and the saute pan were in dire need of retinning, and I overheated the frypan on my very first time using it, melting the tin and ruining the surface (in retrospect, I wonder if it was properly tinned at all), which left me with 3 pans to attempt to repair.

Linings

We haven’t talked much about the linings for copper cookware - copper itself is reactive and would easily tarnish or leach copper into the food if you were to cook directly on it (especially with acidic foods). Therefore, outside of a few very niche applications, copper cookware is lined - there are a few options for the lining material, each with tradeoffs and benefits.

  • Tin: This is the most “classic”. It’s supposed to be fairly nonstick, and is a very thin coating which doesn’t impact the thermal conductivity very much. Unfortunately, tin has a melting point of about 450 °F, which is easily reached on the stove. If the tin lining becomes damaged, copper pots can be re-tinned professionally (doing this at home is a bit out of the question for an NYC apartment, due to the high temperatures and fumes).
  • Stainless Steel: This is a very common option on modern cookware - it’s easy to maintain, though almost impossible to repair if truly damaged. It can also compromise the thermal conductivity we value so much in copper cookware if the lining is too thick.
  • Silver: This is arguably the best option, but is quite rare, and generally carries all the same issues as pure-silver cookware: it’s crazy expensive.
  • Nickel: This is also fairly uncommon, presumably because it’s a bit harder to apply (industrially) than tin, and because some people have nickel sensitivities that make it unsuitable for use. However, it can be applied thinly, and is harder and has a significantly higher melting temperature than tin. In my opinion, if none of the people you regularly cook for have a nickel sensitivity, this is the optimal lining. As a bonus, nickel can be electroplated at home fairly easily.

Prepping the pans

That brings us to the bulk of this post - I had 3 pans in need of re-lining, and professional retinning services run about $80/pan, a bit steep when I’ve already paid about $100/pan on eBay. $180/pan is still an overall better deal than new copper, but with nickel as a lining option that can be applied via electroplating, I wanted to try restoring them myself.

The first step was to prepare the interior surface for plating. Plating on top of oil or other crud will immediately flake off, and plating more or less shows the surface finish of the underlying copper, so it was important to get a clean, smooth surface. I mostly used a palm sander, along with a dremel and hand sanding to get into the corners. This was by far the dirtiest and most difficult part of the process - there aren’t many good options for sanding the interior corners of pans and getting a good surface finish, other than the aforementioned hand-sanding, which is quite laborious. I’m really curious how professional retinning shops would have handled this - maybe they have some different dremel/grinder attachments?

I didn’t aim for a perfect surface finish, but I did try to finish with the palm sander, going up through the grits I had all the way to 300. This left a surface that was smooth to the touch, but definitely not a mirror - I figured it should be similar to any of the aluminum nonstick or cast iron pans that I have; none of them are as smooth as a mirror. Overall, this probably took around 2-3 hours per pan, plus cleanup time.

Electroplating

I did a bunch of research on nickel electroplating - there’s a fantastic handbook about the process that goes through all of the important information: electroplating solutions, voltage and current calculations, etc. It basically boils down to having nickel strips for anodes, a power supply to provide voltage, and an acidic electrolyte solution of nickel. For the electrolyte solution, different additives, nickel salts, and acids can all affect the surface finish and efficiency of the process. These solutions can be purchased commercially, but in the interest of doing things as cheaply as possible, I elected to make my own solution from vinegar (acetic acid), which would produce a nickel acetate solution - if that didn’t work well from a surface finish perspective, I could always re-sand the pans and order some commercial electroplating solution. From there, it’s basically just apply voltage in the right direction and you wind up with nickel on your pot. There are, of course, some process variables that will impact the result. Here are the steps I took:

Making the electroplating solution

As I mentioned, the electroplating solution can be purchased commercially, but I elected to make one by electrolyzing nickel in acetic acid. I bought a 100g sheet of pure nickel online and cut it into a few strips to use as cathodes/anodes. To create the solution, I took about 1L of 5% white vinegar (the most generic type of vinegar you can get at Costco), poured in about a tablespoon of kosher salt, and applied current across the nickel anode + cathode.

For this, I used a lab power supply, since that was conveniently what I already had, and modulated the voltage and current to keep the temperature of the solution at a reasonable level - I aimed for under 140 °F to keep the vinegar fumes under control, and wound up transferring the setup outside anyway. I started initially at a constant voltage of 30V (the max my power supply could deliver), and 1.9 A. As the nickel salts formed, the conductivity of the solution went up, so after about half an hour, the power supply hit the 3A constant current limit I had set. I modulated the current limit between about 2A and 5A to keep the temperature under 140 °F for a total of about 3h, reducing the anode mass by 9.797g and increasing the cathode mass by 3.059g (mostly due to dendritic nickel growth), for a total dissolved nickel quantity of 6.738g, forming a solution of about 0.1 molar nickel.
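Back-of-the-envelope check on that concentration (my own arithmetic, using nickel’s molar mass of 58.69 g/mol):

    n_{\mathrm{Ni}} = \frac{6.738\,\mathrm{g}}{58.69\,\mathrm{g/mol}} \approx 0.115\,\mathrm{mol} \quad\Rightarrow\quad \frac{0.115\,\mathrm{mol}}{\approx 1\,\mathrm{L}} \approx 0.1\,\mathrm{M}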

Applying the nickel

After the solution was prepared, the final step was to actually electroplate the pans. Normally the part to be electroplated is suspended in a jar of the electroplating solution along with a nickel anode, but since I only wanted to electroplate the inside of the copper pans, and because they were so large, I elected for the pans to be the vessel itself. Essentially, I poured the electroplating solution into the pans, then suspended a nickel anode in the solution without touching the pan. I made contact with the pan by clamping the negative lead to the bottom of the handle, to ensure it wouldn’t leave marks in any visible locations. There were two main challenges I ran into while performing the electroplating: bubble formation and uneven current flux.

Hydrogen gas forms while electroplating, and the bubbles tend to stick to the pan, which leads to pitting and a generally uneven surface finish. At first, I tried lowering the current to prevent the bubbles from forming so quickly, and to agitate the solution manually to remove the bubbles, but eventually I just got a stir plate that I could set and forget. Interestingly, the stir bar left a mark on the finish whenever it was spinning, which luckily doesn’t affect the function at all.

The second challenge I had to work through was uneven current flux - electroplating thickness is directly proportional to current flux, and current flux is inversely proportional to resistance. In this setup, the resistance is basically proportional to distance, so adding in the standard 1/r^2 areal propagation, I’m pretty sure the coating thickness is proportional to 1/r^3 (where r is the distance from the anode). Basically, as you get further away from the anode, the amount of nickel being plated drops off dramatically. To combat this, I had to move the anode around to ensure different areas of the pan received roughly equal coating thickness - getting into the corners was particularly difficult, so I wound up wrapping an electrolyte-soaked paper towel around the anode, and using that to “paint on” nickel into the corners that were hardest to reach.

While electroplating, I tried to keep the current quite a bit lower than I had while making the electrolyte solution, to help form a more even surface finish. Typical currents are around 1-6A per dm^2, according to the handbook, and I calculated my Windsor pan to have an internal surface area of 4.36 dm^2. However, I kept the current under 1.5A the whole time, due to the uneven current distribution I previously discussed - I wanted to keep the peak current flux well below 6 A/dm^2.

In total, I electroplated for about 1.2 Ah, and the anode mass decreased by 1.963g, for a total estimated plating thickness of about 2.5 μm (based on current + time, I’m assuming the anode mass decreased by more than expected due to pitting and mechanical erosion). I haven’t used the pans much since plating, but the process was easy enough that if I notice this thin layer wearing away, I can always re-plate for longer.
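For reference, here’s my own Faraday’s-law estimate of what 1.2 Ah should deposit at 100% current efficiency (z = 2 for Ni²⁺, nickel density ≈ 8.9 g/cm³, and the 4.36 dm² = 436 cm² pan area from above):

    m = \frac{Q \cdot M}{zF} = \frac{1.2 \times 3600\,\mathrm{C} \times 58.69\,\mathrm{g/mol}}{2 \times 96485\,\mathrm{C/mol}} \approx 1.31\,\mathrm{g}, \qquad t = \frac{1.31\,\mathrm{g} / 8.9\,\mathrm{g/cm^3}}{436\,\mathrm{cm^2}} \approx 3.4\,\mu\mathrm{m}

so the ~2.5 μm figure corresponds to roughly 75% current efficiency, which seems plausible given all the hydrogen bubbling off the pan.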

I didn’t take as detailed notes for the saute pan, but it has an internal surface area of about 7.04 dm^2, and I electroplated it for about 4 Ah, for an estimated total plating thickness of about 5 μm. I didn’t take notes on plating time for the fry pan, but it was a similar order of magnitude.

Conclusion

After a couple of uses, I haven’t noticed the plating wearing off very significantly, but if I notice it wearing off over time, it’s pretty easy to re-do or add on to the existing electroplating.

This process has been way easier than expected, although the results definitely aren’t quite as beautiful as a professional re-tinning service. I feel like my pans perfectly suit my engineer chef perspective - they’re beautiful vintage French heirlooms in one sense, and a wacky science project in another. My cookware is now beautiful and uniquely mine, so I’m thrilled with it. As always, thanks for reading! :)

Apple Pound Cake (2023-01-31)
https://www.lycarter.com/2023-01-31/recipe-apple-pound-cake

From Joanne Chang

12 servings

Ingredients:

  • 1 stick butter
  • 2 tsp vanilla paste
  • 3 tbsp heavy cream
  • 3 large eggs
  • 3/4 cup (150g) sugar
  • 1 1/4 cup (150g) sifted cake flour
  • 1/2 tsp baking powder
  • 1/4 tsp salt
  • 1 apple, peeled and thinly sliced

Directions:

  1. Preheat oven to 350 °F and line a 9”x5” loaf pan or 8” round cake pan with parchment paper.
  2. Melt butter in a pan, then add vanilla and whisk in cream.
  3. Whip eggs and sugar for 4-5 mins at medium speed.
  4. Sift together dry ingredients, then fold into the egg and sugar mixture.
  5. Slowly stir this mixture into the cooled butter mix.
  6. Pour into the selected pan, then shingle apple slices on top, and bake for 50-60 mins.

Joanne Chang (founder of Flour, one of my favorite Boston-area restaurants!) ran an Instagram series in the first fall of the pandemic to promote baking at home (and sold bake kits for it…). She originally did this one in a rectangular loaf pan, but I decided that a full round pan with nicely-shingled apple slices would be fun to make and look really good, plus I had some apples in October when I made this originally (10/6/20).

BBQ Pork Ribs (2023-01-31)
https://www.lycarter.com/2023-01-31/recipe-bbq-pork-ribs

Adapted from Binging with Babish

12-16 servings

Ingredients:

  • 3 racks of pork ribs (10 lbs)

Dry Rub

  • 3 tbsp chili powder
  • 2 tbsp ground mustard
  • 1 tbsp garlic powder
  • 1 tbsp onion powder
  • 1 tbsp smoked paprika
  • 1 tbsp dried oregano
  • 1 tbsp salt
  • 1 tbsp black pepper
  • 1 tbsp white pepper

Sauce

  • 2 cups apple cider vinegar
  • 1 1/2 cups ketchup
  • 3/4 cup light brown sugar
  • 1/2 tsp paprika
  • 1/4 cup yellow mustard
  • 2 tsp Worcestershire Sauce
  • 1 tsp garlic powder
  • 1 tsp onion powder
  • 1 tsp salt

Directions:

  1. The day before, trim the silverskin off the pork and apply the dry rub liberally.
  2. For the sauce, whisk together the ingredients and simmer until syrupy (the consistency of cream, not honey - it will tighten up as it cools), about 1-1 1/2 hrs.
  3. Preheat oven to 235-250 °F.
  4. Line a sheet tray with a bed of aluminum foil. In addition to that, wrap and crimp the ribs with additional aluminum foil.
  5. Bake for 2-2 1/2 hrs. (todo: what temp did the meat register at this point?)
  6. Remove the top of the aluminum foil wrapping around the ribs and bake for an additional 1 1/2 hrs, basting with the sauce every 20 mins.
  7. During the last 10 mins, crank the temperature to 500 °F to form a crust. (todo: what temp did the meat register at this point?)

I don’t remember exactly what led me to originally try this recipe on 7/9/20 - I think I was looking through Babish’s basics book and decided this looked tasty. It came out great on the first try, then a little dry on the second try - I tweaked the instructions at that point to 235 °F instead of 250 °F. The next time I try this, I’ll try to record all of the internal temps a lot more closely to maybe make a temperature graph. I’ll also probably consult r/smoking (bbq, not cigarettes), which has lots of great advice on low-and-slow cooking.

Oyakodon (2023-01-31)
https://www.lycarter.com/2023-01-31/recipe-oyakodon

From J Kenji Lopez-Alt

4 small servings.

Ingredients:

  • 1 cup dashi
  • 1 tbsp soy sauce
  • 2 tbsp sake
  • 1 tbsp sugar
  • 1 large onion, slivered
  • 12 oz chicken thighs, thinly sliced
  • 3 scallions, thinly sliced
  • 3-4 eggs, scrambled
  • 2 cups rice
  • furikake

Directions:

  1. Mix together the dashi, soy sauce, sake, and sugar, and simmer the onion in the mixture for 5 mins.
  2. Add chicken, and continue simmering strongly for an additional 5-7 mins.
  3. Stir in scallions and season to taste, reducing to a bare simmer.
  4. Drizzle in the eggs so that they remain generally clumped together, and simmer an additional 1-3 mins.
  5. Serve hot over rice, and season with furikake if desired.

I discovered this recipe during the pandemic (first cooked on 5/19/20) when Kenji started posting POV cooking videos regularly. It seemed tasty and I generally keep all of these ingredients on hand, so it became something we cook every couple of months.

Peanut Butter Miso Cookies (2023-01-31)
https://www.lycarter.com/2023-01-31/recipe-peanut-butter-miso-cookies

From NY Times

Claimed yield: 18 cookies

Ingredients:

  • 1 3/4 cup (225g) AP flour
  • 3/4 tsp baking soda
  • 1/2 tsp baking powder
  • 1/2 cup (1 stick) unsalted butter
  • 1 cup (220g) light brown sugar
  • 1/2 cup (100g) granulated sugar
  • 1/3 cup white miso paste
  • 1/4 cup peanut butter
  • 1 large egg
  • 1 1/2 tsp vanilla extract
  • 1/2 cup (100g) demerara sugar

Directions:

  1. Whisk together flour, baking soda, baking powder and set aside.
  2. Cream butter, light brown sugar, and granulated sugar together.
  3. Add miso and peanut butter to the butter and sugar mixture, and stir an additional minute.
  4. Add egg and vanilla and mix until just combined.
  5. Add one third of the dry mixture at a time into the wet mixture.
  6. Roll out small balls of cookie dough, then roll those around in a bowl filled with demerara sugar to coat thoroughly.
  7. Refrigerate for at least 2h and up to overnight.
  8. Heat oven to 350 °F and bake for 15 mins.
  9. After 15 mins, take the tray of cookies out and tap against the counter to flatten and remove bubbles, then bake for an additional 3-4 mins.
  10. Let cookies cool for 3-5 mins before transferring to a cooling rack.

I wrote this recipe down on 9/3/20, but I think I may have tried them before that. They’re wonderfully chewy and savory. I could easily eat 3 in a single sitting.

Southern-style Baked Mac & Cheese (2023-01-31)
https://www.lycarter.com/2023-01-31/recipe-southern-style-mac-and-cheese

Adapted from Basics with Babish

6 giant servings, 8-10 normal servings.

Ingredients:

  • 1 lb short pasta, cooked very al dente
  • 1 can evaporated milk
  • 12 oz mozzarella, shredded or cubed
  • 12 oz cheddar, shredded
  • 2 oz parmesan, shredded
  • 2 eggs, scrambled

Directions:

  1. Cook the pasta to al dente, or just shy of al dente.
  2. Add the evaporated milk, half of the cheddar, and all of the mozzarella to the pot with the pasta, and stir until all of the cheese has melted.
  3. Once the pasta has cooled enough to not burn, whisk in the eggs.
  4. Transfer to a glass baking dish, top with the remaining cheddar and parmesan, and bake at 375 °F for 45 minutes until browned in some parts.
  5. Rest for 10 minutes and serve.

I tried a few different mac & cheese recipes before settling on this one on 6/4/20. It’s still not quite perfect - it can wind up a bit grainy since it doesn’t use any American cheese as an emulsifier. I don’t love the over-the-top gooey mac & cheese made with a bechamel sauce, and this is a bit closer to the mac & cheese you’d normally find at a Southern barbecue as a side dish. It also doesn’t reheat super well in a microwave, though reheating in a pot with a splash of milk would probably work alright.

Adding custom tags for recipes to Jekyll (2023-01-02)
https://www.lycarter.com/2023-01-02/recipe-tags

My friend Jiahui decided to organize a coworking session today, so I took the opportunity to do some work on this blog and put together a new page listing all of the recipes I’ve digitized. I also went ahead and split each of the posts that had multiple recipes into a post per recipe. You can find the list of recipes at /recipes, or from the top bar above.

This process was a lot more challenging than anticipated, because Liquid is only a templating language, not a real programming language. Feel free to just check out the source, or commit 3f6ccb6 where I added the working version of the page. Essentially, I added new data to each post, then grabbed that from every post to form a list of recipe tags. From there, I made another list holding, for each tag, the list of posts containing that tag - basically a map, but done by matching indexes. It’s a bit hacky, but it worked in the end. From there, I just customized the html and templating I’d written for the original /tags page. A simplified sketch of the idea is below.
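Something along these lines, in plain Liquid (the recipe_tags front-matter key is my stand-in name, and this uses a straightforward nested loop rather than the index-matching trick from the actual commit):

    {% comment %} Collect the unique recipe tags across all posts {% endcomment %}
    {% assign all_tags = "" | split: "" %}
    {% for post in site.posts %}
      {% for tag in post.recipe_tags %}
        {% unless all_tags contains tag %}
          {% assign all_tags = all_tags | push: tag %}
        {% endunless %}
      {% endfor %}
    {% endfor %}

    {% comment %} Render each tag with the posts that carry it {% endcomment %}
    {% for tag in all_tags %}
      <h2>{{ tag }}</h2>
      <ul>
        {% for post in site.posts %}
          {% if post.recipe_tags contains tag %}
            <li><a href="{{ post.url }}">{{ post.title }}</a></li>
          {% endif %}
        {% endfor %}
      </ul>
    {% endfor %}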

In future coworking sessions, if I don’t have much to do, I’ll take the time to write up some of the old recipes I’ve got written down but not yet digitized. Right now there are 20 written up; I have about another 20-30 written down in my notebook, plus another 10-15 in a recipe box my mom got me for Christmas in 2021. If anyone has any recipe requests, feel free to reach out.

As always, thanks for reading! :)

Links of the Week 11/15 (2022-11-15)
https://www.lycarter.com/2022-11-15/lotw-10

Just a single link for this week, also from Tom Scott’s newsletter:

As always, thanks for reading! :)
