No management needed: anti-patterns in early-stage engineering teams
The article examines "no management needed" as an anti-pattern in early-stage engineering teams, arguing that even small teams of strong engineers still need someone to own prioritization, coordination, and feedback, and that skipping management simply leaves that work undone or unevenly distributed.
The Tulip Creative Computer
Tulip CC is an open-source, self-contained "creative computer": a portable device for writing and running MicroPython code and making music, with a touchscreen, a built-in synthesizer, and MIDI support, also available as a desktop build rather than running on a general-purpose operating system.
AI Generated Music Barred from Bandcamp
Bandcamp, a popular platform for independent musicians, has moved to bar AI-generated music from the service. The article covers the policy change and what it means for artists experimenting with AI tools and for the wider music industry.
Games Workshop bans staff from using AI, management not excited about the tech
Games Workshop, the company behind the Warhammer franchise, has banned its staff from using AI in content or designs, citing concerns about the technology. The company's senior managers are not currently excited about AI and its potential impact on the business.
Are two heads better than one?
The article discusses the potential drawbacks of collaboration in game development, emphasizing that having multiple people work on a game doesn't necessarily lead to better results. It explores the importance of clear communication, a unified vision, and individual responsibility in producing a successful game.
How to make a damn website (2024)
The article provides a step-by-step guide on how to create a basic website, covering topics such as choosing a domain, setting up hosting, building the website with HTML, CSS, and JavaScript, and deploying the site.
Show HN: Ayder – HTTP-native durable event log written in C (curl as client)
Hi HN,
I built Ayder — a single-binary, HTTP-native durable event log written in C. The wedge is simple: curl is the client (no JVM, no ZooKeeper, no thick client libs).
There’s a 2-minute demo that starts with an unclean SIGKILL, then restarts and verifies offsets + data are still there.
Numbers (3-node Raft, real network, sync-majority writes, 64B payload): ~50K msg/s sustained (wrk2 @ 50K req/s), client P99 ~3.46ms. Crash recovery after SIGKILL is ~40–50s with ~8M offsets.
Repo link has the video, benchmarks, and quick start. I’m looking for a few early design partners (any event ingestion/streaming workload).
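The post's point is that plain curl is enough to be a client. For a feel of what that interaction might look like, here is a minimal sketch of appending and reading records over HTTP in Python; the endpoint paths, query parameters, and response shapes below are assumptions for illustration only, not Ayder's actual API (the repo's quick start has the real routes).

```python
# Minimal sketch of talking to an HTTP-native event log such as Ayder.
# NOTE: the endpoint paths and response shapes below are hypothetical;
# consult the repo's quick start for the real API.
import requests

BASE = "http://localhost:8080"   # assumed listen address
TOPIC = "orders"                 # hypothetical topic name

# Append a record (the equivalent of a curl -X POST --data '...' call).
resp = requests.post(f"{BASE}/topics/{TOPIC}/records",
                     data=b'{"order_id": 42, "amount": 9.99}',
                     timeout=5)
resp.raise_for_status()
offset = resp.json().get("offset")   # assumed response field
print("appended at offset", offset)

# Read records back starting from a given offset.
resp = requests.get(f"{BASE}/topics/{TOPIC}/records",
                    params={"offset": 0, "limit": 100},
                    timeout=5)
resp.raise_for_status()
for record in resp.json():           # assumed: JSON array of records
    print(record)
```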
Scott Adams has died
Ask HN: Vxlan over WireGuard or WireGuard over Vxlan?
When traversing a public network. Let’s agree going recursive (WireGuard inside VXLAN inside WireGuard) is a bad idea.
Influencers and OnlyFans models are dominating U.S. O-1 visa requests
Apple Creator Studio
Apple introduces Apple Creator Studio, a collection of creative apps designed to inspire and empower artists, photographers, and content creators to bring their ideas to life on Apple devices.
Inlining – The Ultimate Optimisation
Show HN: Nogic – VS Code extension that visualizes your codebase as a graph
I built Nogic, currently a VS Code extension, because AI tools make code grow faster than developers can build a mental model of it by jumping between files. Exploring the structure visually has been helping me onboard to unfamiliar codebases faster.
It’s early and rough, but usable. Would love feedback on whether this is useful and what relationships are most valuable to visualize.
Everything you never wanted to know about file locking (2010)
The article surveys the file locking primitives available on Unix-like systems (flock(), lockf(), and POSIX fcntl() locks) and their surprising, mutually incompatible semantics, such as fcntl locks being dropped when any file descriptor to the file is closed, explaining why reliable locking, especially over NFS, is much harder than it looks.
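As a small illustration of the kind of locking the article covers, here is a sketch using Python's standard-library fcntl module (Unix only) to take a BSD-style flock() lock. It also shows why these locks are called advisory: they only exclude other processes that ask for the same lock.

```python
# Sketch: advisory file locking with BSD flock() via Python's stdlib (Unix only).
# "Advisory" means cooperating processes must all take the lock; nothing stops
# a process that never calls flock() from writing to the file anyway.
import fcntl
import time

with open("/tmp/example.lock", "w") as lockfile:
    # Blocks until no other process holds an exclusive lock on this file.
    fcntl.flock(lockfile, fcntl.LOCK_EX)
    try:
        print("lock acquired, doing work...")
        time.sleep(2)  # critical section
    finally:
        fcntl.flock(lockfile, fcntl.LOCK_UN)  # also released when the file closes
```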
Legion Health (YC S21) Hiring Cracked Founding Eng for AI-Native Ops
Legion Health (YC S21), a psychiatry-focused mental health provider, is hiring a founding engineer to build AI-native operations: automating the back-office workflows around patient care with LLM-driven tooling rather than scaling up a manual ops team.
Choosing learning over autopilot
The article discusses the importance of choosing learning over autopilot in one's personal and professional life. It emphasizes the benefits of continually challenging oneself, staying curious, and embracing opportunities for growth and development instead of falling into routine or complacency.
Signal leaders warn agentic AI is an insecure, unreliable surveillance risk
The president and vice president of Signal, a messaging app, have warned that agentic AI systems are insecure, unreliable, and a surveillance nightmare. They argue that these AI systems pose significant risks and should be approached with caution.
Git Rebase for the Terrified
The article provides a beginner-friendly introduction to the Git rebase command, explaining its purpose, benefits, and practical steps for using it effectively to maintain a clean and organized commit history.
Open sourcing Dicer: Databricks's auto-sharder
This article discusses Dicer, an open-source tool developed by Databricks that automatically partitions and shards data to optimize performance and reduce storage costs in big data applications. Dicer aims to simplify the data partitioning process and improve the efficiency of large-scale data processing.
Superhuman AI Exfiltrates Emails
A bit more at https://simonwillison.net/2026/Jan/12/superhuman-ai-exfiltra...
Going for Gold: The Story of the Golden Lego RCX and NXT
The article explores the story behind the rare golden LEGO RCX and NXT bricks, which were created as part of a promotional campaign in the early 2000s. It delves into the significance and value of these sought-after collector's items within the LEGO community.
Show HN: An iOS budget app I've been maintaining since 2011
I’ve been building and selling software since the early 2000s, starting with classic shareware. In 2011, I moved into the App Store world and built an iOS budget app because I needed a simple way to track my own expenses.
At the time, my plan was to replace a few larger shareware projects with several smaller apps to spread the risk. That didn’t quite work out — one app, MoneyControl, quickly grew so much that it became my main focus.
Fifteen years later, the app is still on the App Store, still actively developed, and still used by people who started with version 1.0. Many apps from that era are long gone.
Looking back, these are some of the things that mattered most:
Starting early helped, but wasn’t enough on its own. Early visibility made a difference, but long-term maintenance and reliability are what kept users.
Focus beat diversification. I wanted many small apps. I ended up with one large, long-lived product. Deep focus turned out to be more sustainable.
Long-term maintenance is most of the work. Adapting to new iOS versions, migrating data safely, handling edge cases, and keeping old data usable mattered more than flashy features.
Discoverability keeps getting harder. Reaching users on the App Store today is much more difficult than it was years ago. Prices are higher than in the old 99-cent days, but visibility hasn’t improved.
I’m a developer first, not a marketer. I work alone, with occasional help from freelancers. No employees, no growth team. The app could probably have grown more with better marketing, but that was never my strength.
You don’t need to get rich to build something sustainable. I didn’t build this for an exit. I’ve been able to make a living from my work for over 20 years, which feels like success to me.
Building things you actually use keeps you honest. Every product I built was something I personally needed. That authenticity mattered more than any roadmap.
This week I released version 10 with a new design and a major technical overhaul. It feels less like a milestone and more like preparing the app for the next phase.
Happy to answer questions about long-term app maintenance, indie development, or keeping a product alive across many iOS generations.
Ask HN: Iran's 120h internet shutdown, phones back. How to stay resilient?
It has been 120 hours (5 days) since the internet shutdown in Iran began. While international phone calls have started working again, data remains blocked.
I am looking for technical solutions to establish resilient, long-term communication channels that can bypass such shutdowns. What are the most viable options for peer-to-peer messaging, mesh networks, or satellite-based solutions that don't rely on local ISP infrastructure?
Ask HN: Discrepancy between Lichess and Stockfish
I’m trying to understand a discrepancy between Lichess’s analysis board and my own Stockfish setup.
On Lichess (browser-based analysis), Stockfish reports close to 1 MN/s on my Redmi Note 14 Pro. However, when I run Stockfish locally via a Python program using the native executable, I only see around 600 kN/s.
What’s confusing is that despite the higher reported speed, Lichess takes about 2:30 to reach depth 30, while my local setup reaches depth 30 in about 53 seconds, even though it reports a lower N/s. Lichess also appears much more “active” in terms of frequent evaluation updates.
I suspect this has to do with how N/s is measured or displayed (instantaneous vs average), differences in search configuration (continuous search vs restarts, MultiPV, hash reuse), or overhead from the way the engine is driven (e.g., UI or I/O throttling). It also raises the question of whether “depth 30” is directly comparable across different frontends.
Has anyone looked into how Lichess reports Stockfish speed, or why a setup showing higher N/s can still take significantly longer to reach the same nominal depth?
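One way to take the frontends out of the picture is to drive the same binary through a UCI wrapper and read the engine's own reported figures, which controls for MultiPV, hash reuse, and UI throttling. Here is a minimal sketch using the python-chess library; it assumes a local stockfish binary on PATH and that Threads and Hash are the options you want to match against Lichess's settings.

```python
# Sketch: read Stockfish's own depth / nodes / nps figures via UCI,
# using the python-chess library. Assumes a `stockfish` binary on PATH.
import chess
import chess.engine

engine = chess.engine.SimpleEngine.popen_uci("stockfish")
engine.configure({"Threads": 4, "Hash": 256})  # match the browser setup if known

board = chess.Board()  # starting position; set a FEN to compare a specific one
info = engine.analyse(board, chess.engine.Limit(depth=30))

# These come from the engine's final "info" line, so nps here is an
# average over the whole search, not an instantaneous reading.
print("depth:", info.get("depth"))
print("nodes:", info.get("nodes"))
print("time :", info.get("time"), "s")
print("nps  :", info.get("nps"))

engine.quit()
```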
A university got itself banned from the Linux kernel (2021)
The University of Minnesota has been banned from contributing to the Linux kernel after researchers submitted patches containing intentional vulnerabilities. The incident has raised concerns about the integrity of open-source software development and the need for robust verification processes.
Why Real Life is better than IRC (2000)
The article discusses the advantages of real-life interactions over online chatting, highlighting the importance of face-to-face communication, nonverbal cues, and the deeper connections that can be formed in the physical world.
Show HN: FastScheduler – Decorator-first Python task scheduler, async support
Hi! I've built this because I kept reaching for Celery for simple scheduled tasks and it felt like overkill. I just needed "run this function every hour" or "daily at 9am", not distributed workers.
So it's decorators for scheduling (@scheduler.every(5).minutes, @scheduler.daily.at("09:00")), state saves to JSON so jobs survive restarts, and there's an optional FastAPI dashboard if you want to see what's running.
No Redis, no message broker, runs in-process with your app. Trade-off is it's single process only — if you need distributed workers, stick with Celery.
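To make the decorator style concrete, here is a minimal sketch built from the examples in the post. Only the decorator spellings are taken from the post; the import path, constructor, and start() call are guesses on my part, so check the project README for the real names.

```python
# Sketch based on the decorator examples in the Show HN post. The import,
# constructor, and start() call are assumptions; see the README for the real API.
from fastscheduler import FastScheduler  # hypothetical import

scheduler = FastScheduler(state_file="jobs.json")  # hypothetical constructor

@scheduler.every(5).minutes
def refresh_cache():
    print("refreshing cache...")

@scheduler.daily.at("09:00")
async def send_digest():
    print("sending the daily digest...")

if __name__ == "__main__":
    scheduler.start()  # hypothetical: runs in-process alongside your app
```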
Show HN: Self-host Reddit – 2.38B posts, works offline, yours forever
Reddit's API is effectively dead for archival. Third-party apps are gone. Reddit has threatened to cut off access to the Pushshift dataset multiple times. But 3.28TB of Reddit history exists as a torrent right now, and I built a tool to turn it into something you can browse on your own hardware.
The key point: This doesn't touch Reddit's servers. Ever. Download the Pushshift dataset, run my tool locally, get a fully browsable archive. Works on an air-gapped machine. Works on a Raspberry Pi serving your LAN. Works on a USB drive you hand to someone.
What it does: Takes compressed data dumps from Reddit (.zst), Voat (SQL), and Ruqqus (.7z) and generates static HTML. No JavaScript, no external requests, no tracking. Open index.html and browse. Want search? Run the optional Docker stack with PostgreSQL – still entirely on your machine.
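For a sense of what processing those dumps involves (this is an illustration, not redd-archiver's code): the Pushshift .zst files are zstandard-compressed newline-delimited JSON and need an enlarged decompression window. A rough sketch of streaming one with the zstandard package:

```python
# Sketch: stream a Pushshift .zst dump (zstd-compressed, newline-delimited JSON).
# This illustrates the input format only; it is not redd-archiver's code.
import io
import json
import zstandard  # pip install zstandard

def iter_records(path):
    with open(path, "rb") as fh:
        # Pushshift dumps use a long compression window, so raise the limit.
        dctx = zstandard.ZstdDecompressor(max_window_size=2**31)
        with dctx.stream_reader(fh) as reader:
            for line in io.TextIOWrapper(reader, encoding="utf-8"):
                yield json.loads(line)

for i, post in enumerate(iter_records("RS_2015-01.zst")):
    print(post.get("subreddit"), post.get("title"))
    if i >= 9:  # just peek at the first ten records
        break
```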
API & AI Integration: Full REST API with 30+ endpoints – posts, comments, users, subreddits, full-text search, aggregations. Also ships with an MCP server (29 tools) so you can query your archive directly from AI tools.
Self-hosting options: - USB drive / local folder (just open the HTML files) - Home server on your LAN - Tor hidden service (2 commands, no port forwarding needed) - VPS with HTTPS - GitHub Pages for small archives
Why this matters: Once you have the data, you own it. No API keys, no rate limits, no ToS changes can take it away.
Scale: Tens of millions of posts per instance. PostgreSQL backend keeps memory constant regardless of dataset size. For the full 2.38B post dataset, run multiple instances by topic.
How I built it: Python, PostgreSQL, Jinja2 templates, Docker. Used Claude Code throughout as an experiment in AI-assisted development. Learned that the workflow is "trust but verify" – it accelerates the boring parts but you still own the architecture.
Live demo: https://online-archives.github.io/redd-archiver-example/
GitHub: https://github.com/19-84/redd-archiver (Public Domain)
Pushshift torrent: https://academictorrents.com/details/1614740ac8c94505e4ecb9d...
Show HN: Ever wanted to look at yourself in Braille?
What a year of solar and batteries saved us in 2025
The author reviews a full year of running home solar panels and battery storage, detailing how much money the system saved in 2025, how it changed their electricity usage, and the overall financial and environmental impact.