(sorted alphabetically)
- Apache Beam
- Buildkite
- DBT
- Flink
- Scala
- Vertex AI (feature store, pipeline)
Solving hard software problems
- For problems related to environment setup, trying to accomplish the same thing on two computers can help identify whether a disparity comes from the environment rather than the code itself (see the sketch after this list).
- If you are blocked when trying to solve your problem, set a timebox for how long you will keep attempting to address it before changing your approach.
- Every half hour or hour, ask yourself "Have I made progress in the last period?" If the answer is no, set a time limit before reassessing your next steps.
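As a concrete illustration of the two-computer comparison above, here is a minimal Python sketch (my own, not from any specific tool) that dumps a diff-friendly environment fingerprint; the script name and the chosen fields are assumptions.

```python
# env_fingerprint.py - hypothetical helper: run it on both machines and diff the
# outputs to spot environment differences (versions, paths, packages) rather than
# code differences.
import json
import platform
import subprocess
import sys


def fingerprint() -> dict:
    """Collect a small, diff-friendly snapshot of the local environment."""
    return {
        "os": platform.platform(),
        "python": sys.version,
        "path_entries": sorted(sys.path),
        # pip freeze gives pinned package versions; empty list if pip is unavailable
        "packages": subprocess.run(
            [sys.executable, "-m", "pip", "freeze"],
            capture_output=True, text=True, check=False,
        ).stdout.splitlines(),
    }


if __name__ == "__main__":
    # Redirect to a file on each machine, e.g. `python env_fingerprint.py > machine_a.json`,
    # then compare with `diff machine_a.json machine_b.json`.
    print(json.dumps(fingerprint(), indent=2, sort_keys=True))
```

Any difference in the two outputs points at the environment rather than the code.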
Tools I use on a regular basis at work.
- Chrome
- Copilot
- Cursor
- Docker
- Git
- GitHub
- Google suite (Gmail, Docs, Meet, Sheets, Slides)
- IntelliJ IDEA
- LLMs (ChatGPT, Claude, Poe)
- iTerm2
- JetBrains Toolbox
- Notion.so
- OpenLens
- Podman
- Postman
- PyCharm
- Rectangle
- Slack
- Stats
- Visual Studio Code
Daily
- 15m to document
- 1h lunch
Weekly
- 30m 1:1 with coworkers I work with on a daily basis
- 30m 1:1 with my manager
- 2 × 1h for continuous learning
Weekly readings - 2024-06-30
Gradually, then Suddenly: Upon the Threshold
I've recently started playing with Claude 3.5 Artifacts for fun, to prototype an idea I had. While I struggled to get it to do something "fairly" simple, I was impressed by how easy prototyping was compared to doing everything myself. I expect this to keep improving, reducing the time spent setting up a development environment and getting results more immediately.
[...] I suggest that people and organizations keep an "impossibility list" - things that their experiments have shown that AI can definitely not do today but which it can almost do.
Innovation through prompting
Pretty exciting ideas on how to use LLMs to enable more dynamic teaching, even if the results might not be perfect.
Algorithmic progress in language models
[LLM] Models require 2× less compute roughly every eight months
That would be 3 times faster than Moore's law (a doubling every 24 months). As with Moore's law, the big question is when those performance improvements will plateau.
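To make the comparison concrete, here is the back-of-the-envelope arithmetic behind the "3 times faster" figure, written out as a small sketch (the periods are the ones quoted above; everything else is just illustration):

```python
# Compute-efficiency doubling (8 months) vs. Moore's law doubling (24 months).
moores_law_period_months = 24      # transistor density doubles roughly every 24 months
llm_efficiency_period_months = 8   # compute needed for a given performance halves every ~8 months

# Ratio of doubling rates: the efficiency curve completes 3 doublings
# in the time Moore's law completes 1.
rate_ratio = moores_law_period_months / llm_efficiency_period_months  # 3.0

# Over one Moore's-law period (24 months), that compounds to 2**3 = 8x less
# compute needed, versus a 2x gain in transistor density.
gain_over_24_months = 2 ** rate_ratio  # 8.0

print(rate_ratio, gain_over_24_months)
```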
How to Make Yourself Into a Learning Machine
A lot of things I've found myself doing over the years: reading many books each year, keeping quotes (highlights) and notes from what I'm reading, using Anki to learn and remember languages and concepts/ideas, using a Zettelkasten (really, just the habit of writing down any thought as a digital note), etc. Definitely a recommended read if you're into personal information management.
Writing one sentence per line
A good way to make your writing clear and to the point. I also suggest using easy words instead of fancy ones that people don't often use.