Author: zonghao · about 1 month ago · original post
Two years ago, someone on HN shared an interesting ADHD hack: a tiny LED that blinks at 120 bpm and gradually slows to 60 bpm, supposedly helping your brain sync with it and calm down into focus mode.

I found Qiaogun's implementation (ADHD_Blink) for the M5StickC Plus and adapted it for the newer M5StickC Plus2 with some tweaks: a simpler 50% duty-cycle flash, a configurable ramp-down, auto sleep, etc.

Honestly, I'm not sure whether it actually works; I'll be trying it out myself to see. But the building process itself was quite fascinating.

I used Claude Code for the entire implementation: reading the original codebase, modifying the firmware, and flashing the device. There's something surreal about an AI having full control over a physical piece of hardware.

It made me wonder: in the future, could AI-connected devices dynamically rewrite their own firmware based on user needs? Imagine telling your device "make this button do X instead" and it just... does.

Original HN comment: https://news.ycombinator.com/item?id=38274782
Based on: https://github.com/Qiaogun/ADHD_Blink
Hardware: M5StickC Plus2 (~$20)

Happy to hear your thoughts, or whether anyone has actually tried this LED trick.
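The timing logic behind the trick is simple to model. Here is a minimal, hypothetical Python sketch of the ramp (the function names, the 300 s ramp length, and the linear ramp shape are my assumptions, not necessarily what the firmware actually does):

```python
def bpm_at(t_s: float, ramp_s: float = 300.0,
           start_bpm: float = 120.0, end_bpm: float = 60.0) -> float:
    """Linearly ramp the blink rate from start_bpm down to end_bpm
    over ramp_s seconds, then hold at end_bpm."""
    frac = min(max(t_s / ramp_s, 0.0), 1.0)
    return start_bpm + (end_bpm - start_bpm) * frac

def half_period_ms(bpm: float) -> float:
    """With a 50% duty cycle the LED is lit for half of each beat:
    each beat lasts 60000 / bpm ms, half on and half off."""
    return 60000.0 / bpm / 2.0
```

On the device, a loop would call these each tick to decide how long to hold the LED on and off; once `bpm_at` has settled at 60 bpm for a while, an auto-sleep timer could power the unit down.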
Author: kwunnnn · about 1 month ago · original post
Author: trissim · about 1 month ago · original post
I formalized the Single Source of Truth (SSOT) principle in Lean 4 (~2.1k LOC, zero sorry) and proved two core results:

Structural SSOT is achievable only when a language provides definition-time hooks and runtime introspection. Macros/codegen (which run before a definition exists) and reflection (which runs after) are insufficient. These requirements are derived, not chosen: because structural facts are fixed at definition, derivation must occur at definition time and be introspectable afterwards to verify DOF = 1.

Would appreciate review, critique, or independent checking of the Lean scripts.
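To make the introspection requirement concrete, here is a tiny, hypothetical Lean 4 sketch (not from the author's development; the `User` structure is invented) showing a structural fact being recovered from the environment rather than restated by hand, which is the flavor of "one defining occurrence" the DOF = 1 condition asks for:

```lean
import Lean
open Lean Meta

structure User where
  name : String
  age  : Nat

-- Introspect the definition after it exists: the field list is read
-- back from the environment, so the structure declaration remains the
-- single source of truth for it.
#eval show MetaM (Array Name) from do
  let env ← getEnv
  return getStructureFields env ``User
```

A definition-time hook would go further than this after-the-fact query: it would run the derivation at the moment `User` is elaborated, which is the part the post argues plain reflection cannot supply.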
Author: ahamidi_ · about 1 month ago · original post
Runs local browser instances of Meta's SAM Audio playground so you can isolate vocals/drums from audio of any length without running SAM locally or hosting inference.

- Audio longer than 29s is chunked with ffmpeg
- Audio chunks and prompts are submitted to the playground in parallel via Playwright
- Web UI for storing tracks and re-editing previous outputs

Demo video: https://github.com/user-attachments/assets/d5d3b53d-6ac9-40fc-9776-1afc7efbe4f4
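The chunking step can be sketched with ffmpeg's segment muxer. A hypothetical helper (the function name, output pattern, and exact flag layout are my assumptions about one way to do it, not the project's actual code):

```python
def ffmpeg_segment_cmd(path: str, chunk_seconds: int = 29,
                       out_pattern: str = "chunk_%03d.wav") -> list[str]:
    """Build an ffmpeg command that splits `path` into fixed-length
    segments, matching the >29s chunking described above."""
    return [
        "ffmpeg", "-i", path,
        "-f", "segment",                       # segment muxer: one file per chunk
        "-segment_time", str(chunk_seconds),   # max length of each chunk
        "-c", "copy",                          # no re-encode: fast, lossless split
        out_pattern,
    ]
```

The resulting command can be run with `subprocess.run(cmd, check=True)`, after which each chunk is submitted to the playground as its own Playwright job.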