logs
I have developed a reasonably simple text-based log format to keep track of what I do day-to-day. These get parsed and turned into SQLite tables, then those tables get presented here in HTML.
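For a rough idea of the parsing step, here is a minimal sketch of pulling one entry line apart (the `HH:MM: description #tag #timelog:HH:MM:SS` shape used below); the regex and field names are illustrative, not the actual parser or schema:

```python
import re

# Matches the "HH:MM: rest-of-line" shape of a log entry.
ENTRY = re.compile(r"^(?P<time>\d{2}:\d{2}): (?P<body>.*)$")

def parse_entry(line):
    """Parse one log entry line into a dict, or None if it isn't one."""
    m = ENTRY.match(line)
    if not m:
        return None
    body = m.group("body")
    # Pull out #tags, including the special #timelog:HH:MM:SS tag.
    tags = re.findall(r"#([\w:-]+)", body)
    timelog = next(
        (t.split(":", 1)[1] for t in tags if t.startswith("timelog:")), None
    )
    # The title is the body with all tags stripped off.
    title = re.sub(r"\s*#[\w:-]+", "", body).strip()
    return {
        "time": m.group("time"),
        "title": title,
        "tags": [t for t in tags if not t.startswith("timelog:")],
        "timelog": timelog,
    }
```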
- 2024-11-19 Tuesday, November 19th
- 2024-11-18 Monday, November 18th
- 2024-11-17 Sunday, November 17th
- 2024-11-16 Saturday, November 16th
- 2024-11-15 Friday, November 15th
- 2024-11-14 Thursday, November 14th (Stopping Practice)
- 2024-11-13 Wed, November 13
- 2024-11-12 November 12th
- 2024-11-11 Monday Nov. 11th
- 2024-11-06 November 6th, 2024
- 2024-11-05 November 5th, 2024
- 2024-11-01 Friday, November 1st
- 2024-10-31 Happy Halloween!
- 2024-10-30 Wednesday, October 30th
- 2024-10-29 Tuesday, October 29th
- 2024-10-28 Monday, October 28th
- 2024-10-26
- 2024-10-25 Friday, October 25th
- 2024-10-24 Thursday, October 24th
- 2024-10-22 Tuesday, October 22nd
- 2024-10-21 Analogue 3d pre-order
- 2024-10-15 Tuesday October 15th
- 2024-10-06 Sunday, October 6th
- 2024-10-05 October 5th, 2024
- 2024-10-03 Thursday October 3rd
- 2024-10-02 Wednesday, October 2nd
- 2024-10-01 Tuesday, October 1st
- 2024-09-30 Monday, September 30th
- 2024-09-27 Friday, Sep 27th
- 2024-09-26
- 2024-09-25
- 2024-09-24
- 2024-09-03 Nighthub
- 2024-09-02 Bookkeeping
- 2024-09-01
- 2024-08-31
- 2024-08-30 Bookkeeping
- 2024-08-29 Leetcode Day, Dynamic Programming Research
- 2024-08-28 Leetcode
- 2024-08-27 Leetcode
- 2024-08-26 Leetcode Day
- 2024-08-25 Research: Priority Queue / Heap
- 2024-08-23 Bookkeeping
- 2024-08-22 Leetcode
- 2024-08-21 Leetcode, adaptive filters
- 2024-08-20 Another bookkeeping day.
- 2024-08-19 Leetcode Day.
- 2024-08-18 Leetcode Day.
- 2024-08-17 Red-black tree day
- 2024-08-16 Interrupting the Circle
- 2024-08-15 Leetcode Day.
- 2024-08-14 Leetcode Day.
- 2024-08-13 Leetcode Day.
- 2024-08-12 Leetcode Day.
- 2024-08-09 Day 82. I RECURSER PAUL BATCHELOR.
- 2024-08-08 Day 81. Never Graduate.
- 2024-08-07 Day 80. Some reading.
- 2024-08-06 Day 79. ALSA and CPAL troubleshooting.
- 2024-08-05 Day 78. Initial IT parser attempt.
- 2024-08-04 Day 77. investigating ft2play
- 2024-08-03 Day 76. c2rust, XM Players in Rust, More Coax
- 2024-08-02 Day 75. Silly idea. Project name: "Coax"
- 2024-08-01 Day 74. Presentation on Trio
- 2024-07-31 Day 73. Uploaded Trio to the Web. Balloon-ing in poke.
- 2024-07-30 Day 72. Trio Initial Web Port made
- 2024-07-29 Day 71. Week 11.
- 2024-07-28 Day 70: Trio Chord Selection Heuristics.
- 2024-07-27 Day 69. Audio Telephone Composing.
- 2024-07-26 Day 68. ChordSelector work.
- 2024-07-25 Day 67. Chord Selector Planning.
- 2024-07-24 Day 66. Initial Dagzet Rust Port Completed.
- 2024-07-23 Day 65. Voice Scheduler Feature Complete. Rust Dagzet nearly feature complete.
- 2024-07-22 Day 64. Haptic Knob Controllers.
- 2024-07-21 Day 63. Week 10.
- 2024-07-20 Day 62. My generics start to become unhinged.
- 2024-07-19 Day 61. Gestured Voices Work in Trio.
- 2024-07-18 Day 60. Staggered voices work in Trio.
- 2024-07-17 Day 59. Eventful Gestures and Dagzet
- 2024-07-16 Day 58. Impossible Day. (AGAIN??)
- 2024-07-15 Day 57. Event-driven Gesture, Dagzet, Testing in Rust
- 2024-07-14 Day 56. Week 9. Linear Gesture Builder for mnolth
- 2024-07-13 Day 55. A sine in tic80. Finally.
- 2024-07-12 Day 54. Trios, triads, and bytebeats.
- 2024-07-11 Day 53. All you need is NAND.
- 2024-07-10 Day 52. I do not understand how tic80 sound works.
- 2024-07-09 Day 51. Singing Drawing Tablets
- 2024-07-08 Day 50. Week 8. More TIC80 code exploration, unexpected Wacom Tablet, staggered voice algorithm
- 2024-07-07 Day 49. Mini drawing tablets, TIC-80 audio code
- 2024-07-06 Day 48. Setting up NeoVim with Kickstart
- 2024-07-05 Day 47. Poke Prototype Published
- 2024-07-04 Day 46. Connected poking sounds to poking visuals
- 2024-07-03 Day 45. Jitter, Summer 2s First day in Hub
- 2024-07-02 Day 44. Initial Tiny Creature Sounds, MIDI knobs, slightly more typescript, Zig
- 2024-07-01 Day 43. Week 7. Audio Troubleshooting, Moving Circles, Stateful Sliders in React and Typescript
- 2024-06-30 Day 42. (Something something Deep Thought)
- 2024-06-29 Day 41. Halfway.
- 2024-06-28 Day 40. Missing nodes. HTML/WASM export in tic80.
- 2024-06-27 Day 39. NeXT-ing. TIC80 investigations.
- 2024-06-26 Day 38. Thinkpad arrival. Space Jam.
- 2024-06-25 Day 37. HTMLize codestudy work, half-baked demos
- 2024-06-24 Day 36. Week 6. A Blob Named Bucket.
- 2024-06-23 Day 35. Sokol investigations.
- 2024-06-22 Day 34. Investigating 2d game engines. Thinkpad ordered.
- 2024-06-21 Day 33. Attempted better tract length control, "Milowda" chords
- 2024-06-20 Day 32. Tongue control in VoxBoxOSC, Agnus Dei
- 2024-06-19 Day 31. Happy Juneteenth!
- 2024-06-18 Day 30. VoxBox over OSC, more Chords.
- 2024-06-17 Day 29. Start of Week 5?
- 2024-06-16 Day 28. Chord progressions in vocal ensemble, studying VCV Potential code
- 2024-06-15 Day 27. Coarse tract size control, Linear Gesture Demo, Better Sounds in Chords demo
- 2024-06-14 Day 26. Compile Potential VCV Rack plugins, More Gesture Path
- 2024-06-13 Day 25. Gesture Path initial work, initial work on chords demo
- 2024-06-12 Day 24. Implemented rephasor. Hack the Planet!
- 2024-06-11 Day 23. Impossible Day.
- 2024-06-10 Day 22. Preset import. React tic-tac-toe.
- 2024-06-09 Day 21. Even more NaN-hunting. Initial preset export. Synthesized overtone throat singing.
- 2024-06-08 Day 20. More NaN-hunting.
- 2024-06-07 Day 19. Initial velum implementation. The NanHunt begins.
- 2024-06-06 Day 18. Post Game-Jam.
- 2024-06-05 Day 17: Game Jam?
- 2024-06-04 Day 16. Game Jam Day.
- 2024-06-03 Day 15. (Ooops I forgot it was game jam day)
- 2024-06-02 Day 14. Initial singing web demo
- 2024-06-01 Day 13. Moving day.
- 2024-05-31 Day 12. Picked up keys. Initial working vocal tract.
- 2024-05-30 Day 11. Glottis algorithm in Rust, thoughts about Tract
- 2024-05-29 Day 10. Glottis Scaffolding, some realtime audio in Rust
- 2024-05-28 Day 9. Rust code organization.
- 2024-05-27 Day 8. Memorial Day.
- 2024-05-26 Day 7. Recurse Wiki is born. Initial log generator plans
- 2024-05-25 Day 6. Saturday. Not too much work.
- 2024-05-24 Day 5 planning VoxBox, monowav in Rust
- 2024-05-23 Day 4. mono WAV writer in C
- 2024-05-22 Day 3. Laptop screen broken. What now?
- 2024-05-21 Day 2. Some initial system setups, various meetups
- 2024-05-20 Day 1. First day of recurse. Organize.
2024-11-19 Tuesday, November 19th
09:47: LC102: Binary Tree Level Order Traversal #grind75 #timelog:00:26:02
10:21: LC133: Clone Graph #grind75 #timelog:00:31:28
11:30: LC150: Evaluate Reverse Polish Notation #grind75 #timelog:00:34:10
14:05: nextjs #nextjs-dashboard-app #timelog:01:10:22
17:54: port some easy commands #dagzet-port-commands #timelog:01:04:21
21:23: DB internals (225-229) #reading-dbint #timelog:00:21:44
2024-11-18 Monday, November 18th
Sick and in bed.
11:48: LC 542. 01 Matrix #grind75 #timelog:00:13:30
12:05: LC 973. K Closest Points To Origin #grind75 #timelog:00:09:45
12:21: LC 3. Longest substring without repeating characters #grind75 #timelog:00:24:19
12:53: LC 15: 3sum #grind75 #timelog:00:42:44
2024-11-17 Sunday, November 17th
08:47: 334: increasing triplet subsequence #leetcoding #timelog:00:45:10
I glanced at this one I think. I don't think I ever actually tested it.
Well, I didn't understand this problem when I first read it. A subsequence doesn't have to be consecutive. Before I looked at the answer, I tried making it a sliding window problem, and it failed the edge cases.
09:37: 692: top K frequent words #grind75 #timelog:00:31:54
Lexicographic order threw me off. It turns out the built-in structures in Python used to solve this problem (Counter, heapify) seem to handle this for you already? Was I supposed to know that?
Follow-up problem 347 Top K frequent elements. Apparently this follows a similar pattern: https://leetcode.com/problems/top-k-frequent-elements/description/
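Counter and a heap do handle the tie-breaking, if you give the heap the right key. A minimal sketch of the pattern for top K frequent words, assuming ties break lexicographically as in LC692; the trick is the `(-count, word)` key:

```python
from collections import Counter
import heapq

def top_k_frequent_words(words, k):
    # Count occurrences, then pick the k words ordered by descending
    # frequency, with ties broken lexicographically (ascending).
    counts = Counter(words)
    return heapq.nsmallest(k, counts, key=lambda w: (-counts[w], w))
```

The same shape works for 347 (top K frequent elements), minus the lexicographic tie-break.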
12:00: NextJS-ing #nextjs-dashboard-app #timelog:00:52:49
13:11: Watching a NextJS video on forms in Next.JS #timelog:00:20:46
14:36: LC347: top k frequent elements #leetcoding #timelog:00:17:32
This is a variation of top K frequent words
14:57: LC53: Maximum Subarray #grind75 #timelog:00:25:16
I did this already I think. But I didn't submit it, so it's a redo now.
Note to self: a subarray is different from a subsequence. Subarrays are contiguous.
Wow, I did not get this right the first time. I gave up early and looked at the answer.
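For reference, the standard Kadane's algorithm sketch for Maximum Subarray (not necessarily the editorial's exact code):

```python
def max_subarray(nums):
    # Kadane's algorithm: at each element, either extend the running
    # subarray or start fresh there; track the best sum seen so far.
    best = cur = nums[0]
    for x in nums[1:]:
        cur = max(x, cur + x)
        best = max(best, cur)
    return best
```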
15:30: I have to redo the first 3 weeks #grind75
I didn't submit. So, I didn't really learn anything. yay.
15:31: LC57: Insert Interval #grind75 #timelog:01:18:50
Got something working, but I want to review this tomorrow
2024-11-16 Saturday, November 16th
08:53: Finishing up LC417 #grind75 #timelog:00:41:32
I'm hoping I just need to code up an edge case and that my overall solution is solid.
09:06: LC417: They wanted you to use BFS or DFS, not dynamic programming #grind75
My edge case isn't working.
09:14: I just bought a year of premium. #grind75
If I use it for at least 6 months, the annual plan works out cheaper than the monthly rate.
09:36: Onto LC19 #grind75 #timelog:00:28:14
Remove Nth node from linked list
10:35: Examining what's left of grind75 #timelog:00:18:40
I've been just tackling medium problems, and it looks like I'm wrapping up. We have 3 mediums left. Looks like I might be able to get them done by the end of the week.
10:55: LC1730 #grind75 #timelog:00:19:54
Shortest path to get food. Note that the time here is for implementation. Took me about 3-5 extra minutes to get the implementation, but I casually timed that one.
11:24: LC287 #grind75 #timelog:00:35:21
Find the duplicate number. Did brute force, then looked up the answer.
13:00: nextjs dashboard #nextjs-dashboard-app #timelog:00:23:35
14:47: more nextjs #nextjs-dashboard-app #timelog:01:11:09
2024-11-15 Friday, November 15th
10:32: Read up on 211 #grind75 #timelog:01:18:31
Ah. I was not making a Trie in my original implementation. I only kept track of possible combinations for each position, which was more efficient memory-wise, but it did not keep track of paths between unique words.
Example: "dig" and "dog" show up as (d (o (g)) (i (g))). There are two g's here. My structure represented the set as ([d], [o i], [g]).
If I were to add "mop" to the structure, a trie would look like ((m (o (p))) (d (o (g)) (i (g)))), while my structure would look like ([m d], [o i], [g p]). There's no way to tell if "mip" is a valid path or not.
Even with pointers, "mog" could still be valid with this path: ([m:[o] d:[o i]] [o:[g p] i:[g]] [g p])
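A minimal sketch of the trie approach described above, with a stack-based DFS for the '.' wildcard; method names are illustrative, not LeetCode's exact signatures:

```python
class WordDictionary:
    # Each trie node is a dict of children; a sentinel key marks
    # end-of-word, so "dig"/"dog" share only the real "d" prefix.
    END = "$"

    def __init__(self):
        self.root = {}

    def add_word(self, word):
        node = self.root
        for ch in word:
            node = node.setdefault(ch, {})
        node[self.END] = True

    def search(self, word):
        # Iterative DFS with an explicit stack; a '.' branches into
        # every child of the current node.
        stack = [(self.root, 0)]
        while stack:
            node, i = stack.pop()
            if i == len(word):
                if self.END in node:
                    return True
                continue
            ch = word[i]
            if ch == ".":
                for key, child in node.items():
                    if key != self.END:
                        stack.append((child, i + 1))
            elif ch in node:
                stack.append((node[ch], i + 1))
        return False
```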
12:38: NextJS dashboard #nextjs-dashboard-app #timelog:00:48:14
Just finished chapter 9. Chapter 10 will be another time.
15:51: Grind75 #timelog:00:21:45
Finally finishing up 211.
16:55: Grind75 #grind75 #timelog:01:07:04
417: Pacific Atlantic Water Flow. I think I am close. It seems to lend itself well to dynamic programming, but I turned up an edge case where I wasn't checking all the available directions. Will look into that later.
22:16: Maybe a page of reading #reading-dbint #timelog:00:07:43
2024-11-14 Thursday, November 14th (Stopping Practice)
10:31: Grind75 #grind75 #timelog:01:27:44
211: Design Add and Search Words Data Structure. I think it's a Trie problem. Spent most of this time just getting the Trie structure the way I imagined it. Have not implemented the search yet, which I believe will be a DFS using a stack (for backtracking when a '.' is encountered).
12:56: 211 continued #grind75 #timelog:01:12:21
Ran out of time. Right data structure (trie), but my search doesn't work properly ('..' is a problem on the huge edge case).
14:23: wait wait, one idea to try. #grind75 #timelog:00:28:15
I'm breaking out too early.
Nevermind. I'm not getting it right. Time to stop.
16:38: Re-examine priority tasks #timelog:00:01:48
16:39: Completed? #flashcard-mvp
I mean, it's got a bug, but it works.
16:41: Typescript Handbook #read-typescript-handbook #timelog:00:24:07
Finally done with the reading.
18:40: database internals #reading-dbint #timelog:00:18:08
Ordering, operations, invocation/completion, shared memory, registers, safe/atomic/regular, consistency models
21:12: database internals #reading-dbint #timelog:00:13:18
consistency models, single-operation, strict consistency (theoretical), all possible histories vs histories permissible under X, linearizability, total ordering of events
2024-11-13 Wed, November 13
09:33: Grind75 #grind75 #timelog:00:15:00
Watching the neetcode video.
10:48: Grind75 #grind75 #timelog:00:41:52
36: Valid Sudoku
12:34: Grind75 #grind75 #timelog:00:13:00
49: Group anagrams
12:49: 152: Maximum Product Subarray #grind75 #timelog:00:51:16
14:51: Typescript Handbook #read-typescript-handbook #timelog:00:45:23
15:43: Bitrune web-demo: initial steps. #bitrune-web-demo #timelog:00:51:59
16:51: Quick daily log parser #timelog:00:17:07
2024-11-12 November 12th
09:45: Grind75 #grind75 #timelog:00:24:40
198: House Robber
10:37: Grind75 #grind75 #timelog:00:40:00
134: Gas Station
12:53: Grind75 #grind75 #timelog:00:39:44
134: Gas Station (Continued)
14:26: Grind75 #grind75 #timelog:00:41:26
More Gas Station (134)
17:35: Grind75 #grind75 #timelog:00:58:25
Gas Station, again (134). I thought it was dynamic programming. It was not. Another day wasted. I will have to look at this: https://www.youtube.com/watch?v=lJwbPZGo05A.
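For later reference, the standard one-pass greedy for Gas Station (presumably what that video covers), sketched:

```python
def can_complete_circuit(gas, cost):
    # Greedy, not DP: if total gas >= total cost a solution exists, and
    # whenever the running tank goes negative, no station up to that
    # point can be the start, so restart from the next station.
    if sum(gas) < sum(cost):
        return -1
    tank, start = 0, 0
    for i in range(len(gas)):
        tank += gas[i] - cost[i]
        if tank < 0:
            tank, start = 0, i + 1
    return start
```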
2024-11-11 Monday Nov. 11th
10:38: Grind75 #grind75 #timelog:01:03:56
I'm trying to grok this findMinHeightTrees problem better. Also, LC 739 (Daily Temperatures)
12:55: More Daily Temperatures (LC 739) #timelog:01:49:48
15:27: Even more Daily Temperature (LC 739) #timelog:00:43:06
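A standard monotonic-stack sketch for Daily Temperatures (LC 739):

```python
def daily_temperatures(temps):
    # Monotonic stack of indices whose warmer day hasn't been found yet;
    # each index is pushed and popped at most once, so this is O(n).
    answer = [0] * len(temps)
    stack = []  # indices with strictly decreasing temperatures
    for i, t in enumerate(temps):
        while stack and temps[stack[-1]] < t:
            j = stack.pop()
            answer[j] = i - j
        stack.append(i)
    return answer
```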
16:24: Create daily timelog aggregate
17:40: Typescript Handbook #read-typescript-handbook #timelog:00:27:59
2024-11-06 November 6th, 2024
Today I am at the hub.
11:58: Grind75-ing #grind75 #timelog:00:19:18
Just want to get that last problem working (LC 310: minimum height trees)
12:53: Typescripting #read-typescript-handbook #timelog:00:03:27
14:21: Typescripting #read-typescript-handbook #timelog:00:31:56
2024-11-05 November 5th, 2024
10:03: Grind75 #grind75 #timelog:01:20:00
2024-11-01 Friday, November 1st
08:05: Grind75 #grind75 #timelog:00:24:30
11:01: Logging and admin
11:02: dagzet: more descriptive loops please?
11:31: my HTML renderer is crashing too quietly
14:05: typescripting #read-typescript-handbook #timelog:00:21:42
15:22: nextjs-ing #nextjs-dashboard-app #timelog:01:04:25
16:34: Create a timelog wiki page
19:37: Grind75 #grind75 #timelog:00:12:36
2024-10-31 Happy Halloween!
12:43: NextJS dashboard tutorial #nextjs-dashboard-app #timelog:01:13:45
14:37: Troubleshooting deployment #nextjs-dashboard-app #timelog:00:05:07
15:21: TypeScript handbook #read-typescript-handbook #timelog:00:15:33
2024-10-30 Wednesday, October 30th
14:25: NextJS and React #nextjs-react-guide #timelog:00:14:15
14:41: NextJS dashboard app tutorial #nextjs-dashboard-app #timelog:00:51:37
16:31: more NextJS dashboard-ing #nextjs-dashboard-app #timelog:00:59:07
17:44: Can dagzet tell me the unknown nodes?
20:10: Software dynamics #reading-software-dynamics #timelog:00:20:10
2024-10-29 Tuesday, October 29th
14:46: NextJS react foundations #nextjs-react-guide #timelog:00:55:53
2024-10-28 Monday, October 28th
10:52: Grind75 #grind75 #timelog:01:01:11
14:32: Grind75 #grind75 #timelog:01:08:29
15:33: Setting up the logs
Previous entries today entered now.
16:21: Typescript handbook #read-typescript-handbook #timelog:00:43:58
20:17: Grind75 #grind75 #timelog:00:03:57
2024-10-26
09:26: Grind75 #grind75 #timelog:01:07:57
15:42: Grind75 #grind75 #timelog:01:11:09
18:35: Grind75 #grind75 #timelog:00:20:40
2024-10-25 Friday, October 25th
10:54: Fix my timelog utility
It's not doing HH:MM:SS properly; the hours are rolling over.
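If the bug is formatting a duration through a clock-time formatter (which wraps at 24 hours), plain divmod arithmetic avoids the rollover; a sketch, not the actual utility's code:

```python
def hms(total_seconds):
    # Format a duration as HH:MM:SS without wrapping the hours field,
    # unlike clock-time formatters such as strftime("%H:%M:%S").
    h, rem = divmod(int(total_seconds), 3600)
    m, s = divmod(rem, 60)
    return f"{h:02}:{m:02}:{s:02}"
```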
11:26: Grind75 #grind75 #timelog:00:21:58
14:24: Grind75 #grind75 #timelog:01:38:51
16:44: Task curation
16:45: Future of this? #itparser-rust
It's kind of ad-hoc right now, and I don't have a defined purpose for this at the moment. Getting into schismtracker would be cool, and this could be a way to work with that data. But to what end?
16:47: this was crap #read-js-trivia
I'm sorry. I can only read so much bad writing. This had knowledge, but it was presented in a really unthoughtful way. I appreciated the references, but most of them were blog posts soft-summarizing the better official references. I'm just done with that.
16:48: Too unbounded of a task #frontend-mentor
And it's too design-y for me. I'd rather have a task be more defined like, "okay we want to get a look and feel like this app".
Looking at things like Tailwind UI components <<webdev/tailwindui>> and Radix <<webdev/radix_ui>> is more of a targeted approach.
17:42: Software Dynamics #reading-software-dynamics #timelog:00:36:38
2024-10-24 Thursday, October 24th
10:05: Grind75 #grind75 #timelog:02:02:58
13:32: typescript handbook #read-typescript-handbook #timelog:00:45:25
15:00: Break
16:05: Tailwind components browsing
17:01: Found some NextJS tutorials/guides
I've seen this keyword flash up way more times than tailwind, though the tailwind ui components do look nice. Might be worth setting aside some time for.
17:16: NextJS and Vite seem to be very common
Seems like it's an either/or thing? Whatever, going to dip my toes into both.
Current job I'm looking at has NextJS/React/Typescript combo.
17:26: I have once again found myself on LinkedIn
I think I need to get off my computer now.
17:39: There's a flashcard duplication bug #flashcard-duplication-bug
This must have something to do with cache loading logic. Hopefully not too much of a nightmare to fix.
20:50: Database internals reading #reading-dbint #timelog:00:45:26
Finally finished part one. Started part two (distributed systems).
2024-10-22 Tuesday, October 22nd
10:20: Grind75 #grind75 #timelog:00:30:00
Pen and paper
11:14: Grind75 #grind75 #timelog:01:58:40
Coding it up
12:56: Wow this is taking a while. Ugh. #grind75
13:24: Finally. #grind75
Only took me 2.5 hours.
2024-10-21 Analogue 3d pre-order
09:25: Grind75 #grind75 #timelog:01:44:04
13:45: typescripting #read-typescript-handbook #timelog:00:35:10
17:20: Flashcard-ing
2024-10-15 Tuesday October 15th
Once again, I find myself here in Brooklyn at the hub. Which is interesting only because I don't live here anymore. I've had an extended week trip to NYC from Boston. It's lovely to be back.
13:45: Setting up. Thinking about what to work on
13:53: Lunch
15:01: Pushing changes to website
15:05: Flashcards #flashcard-mvp #timelog:01:06:17
15:23: Some lingering design questions #flashcard-mvp
How and when do I make sure new cards get imported? This is important because there can be situations where there are just enough cards in the metadata to fill the cache, which causes the same cards to be re-used over and over again.
When to recycle cards from level 5 back to level 1?
15:49: Trying to get the loading logic right #flashcard-mvp
My thinking is new cards should be loaded into the system automatically.
My current bucket system is flawed because it only kicks in when the cache is too small. I should have a pre-load step, before the caches are filled, that updates the database.
It's possible that flashcards can go missing if I delete a node. So far, this database has been mostly append-only.
16:32: Break? What to do next?
16:40: Lost and Safe
2024-10-06 Sunday, October 6th
10:02: Grind75 #grind75 #timelog:01:37:11
Uh oh, I've just been running locally. I've been forgetting to submit stuff. I knew things were feeling wrong.
56, 981
12:46: Grind75 #grind75 #timelog:02:37:49
16:09: Typescript Handbook #read-typescript-handbook #timelog:00:44:28
Finished the Generics chapter. Variance annotations, covariance, and contravariance are taking a while to sink in.
17:14: See if I can re-implement 236 #grind75 #timelog:01:02:24
Even with the editorial in my face, it still took me an hour to implement. jeez.
20:57: Some boilerplate python code #flashcard-mvp #timelog:01:06:59
2024-10-05 October 5th, 2024
10:17: Grind75 #grind75 #timelog:01:23:34
39 (combination sum) is a backtracking problem that I am not understanding properly.
14:53: The grind continues. #grind75 #timelog:02:18:10
I did 46, poorly. 39 passed when I used a different data structure.
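A standard backtracking sketch for Combination Sum (LC 39), where passing `i` rather than `i + 1` on recursion is what allows reusing the same candidate:

```python
def combination_sum(candidates, target):
    # Backtracking: at each depth, only choose candidates at index >=
    # start, so each combination is generated exactly once (no
    # permutations of the same multiset).
    results = []

    def backtrack(start, remaining, path):
        if remaining == 0:
            results.append(path[:])
            return
        for i in range(start, len(candidates)):
            c = candidates[i]
            if c <= remaining:
                path.append(c)
                backtrack(i, remaining - c, path)  # i, not i + 1: reuse allowed
                path.pop()

    backtrack(0, target, [])
    return results
```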
16:33: Flashcard planning #flashcard-mvp #timelog:00:45:32
21:22: Some typescript-ing #read-typescript-handbook #timelog:00:30:58
2024-10-03 Thursday October 3rd
12:58: Damn. I guess today I will get some stuff done.
The moment I woke up, this day was going wrong. I was really looking forward to just letting the entire day derail. Yet, here I am at the library.
13:00: Grind75 #grind75 #timelog:01:33:13
Per the recommendation of several people, I am going to just use the Leetcode website to write code. It has the verification and everything, and will help me pick up on weird edge cases.
14:46: typescript handbook: object types #read-typescript-handbook #timelog:01:14:31
2024-10-02 Wednesday, October 2nd
09:49: Grind75 #grind75 #timelog:01:03:30
Attempting that coin change problem again, among other things.
11:15: Mock interview with Dan
12:31: Lunch
14:53: Typescript reading #read-typescript-handbook #timelog:01:16:52
16:20: Database Internals #reading-dbint #timelog:01:01:09
2024-10-01 Tuesday, October 1st
11:01: Grind75 #grind75 #timelog:01:08:04
I hate when people talk in the library. I can't focus.
This graph problem isn't clicking for me yet (LC 207). Grr. Taking a break.
12:45: Refreshing myself on cycle detection in graphs.
Reading 7.9.1 "Finding Cycles" in the Algorithm Design Manual.
There's a mention of using back edges for an undirected graph by checking the ancestor, and it's mentioned in edge classification for directed graphs. Going to try and take another swing at this.
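A sketch of the back-edge idea applied to LC 207: three-color DFS on the directed prerequisite graph, where reaching a gray (still on the recursion stack) node means a back edge, hence a cycle:

```python
def can_finish(num_courses, prerequisites):
    # Course Schedule (LC 207): the courses are finishable iff the
    # prerequisite digraph has no cycle.
    graph = [[] for _ in range(num_courses)]
    for course, prereq in prerequisites:
        graph[prereq].append(course)

    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on stack / done
    color = [WHITE] * num_courses

    def has_cycle(u):
        color[u] = GRAY
        for v in graph[u]:
            if color[v] == GRAY:
                return True  # back edge: v is an ancestor of u
            if color[v] == WHITE and has_cycle(v):
                return True
        color[u] = BLACK
        return False

    return not any(color[u] == WHITE and has_cycle(u)
                   for u in range(num_courses))
```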
13:20: Grind75-ing #grind75 #timelog:01:50:13
15:26: typescript reading #read-typescript-handbook #timelog:00:59:00
17:10: Database reading. #reading-dbint #timelog:00:33:21
2024-09-30 Monday, September 30th
09:25: Grind75 #grind75 #timelog:01:30:00
12:16: This is kind of garbage #frontend-mentor
I've been trying to, in good faith, read through the "javascript trivia" section. I wanted to make flashcards because it felt like rote memorization. But, like, most of the time, the questions aren't focused enough, and the answers meander a little bit.
The nice thing about these questions is that there are references and resources. Most of these are stack overflow and blog posts, and while I've been saving some of these to read, I'm kind of done with the casual long-winded almost correct answers that come from blog posts.
12:19: Flashcards added to rust DZ parser #flashcard-mvp
I was trying to make some flashcards to test, and I realized making well-scoped and well organized flashcards from what I've been reading takes more work than making the MVP program that displays them. So, pausing on this until I figure that out more.
12:22: Typescript handbook #read-typescript-handbook #timelog:00:26:53
This just turned into more neovim troubleshooting. Sigh.
12:38: I'm back in the neovim config hell #read-typescript-handbook
I want to get tsserver working in my neovim setup on my box now.
12:50: neovim needs updating #read-typescript-handbook
I consolidated everything into one git repo, and it works on 2/3 machines. My box has nvim 0.9, everywhere else 0.10. I'm guessing that is the issue.
14:46: Trying to build nvim 0.10 apk from source
15:26: A good enough nvim config hack was made
APKBUILD route wouldn't work due to incorrect versions in deps. So I just wrote a little hack. Tooltips in the typescript LSP do not seem to work on my box.
15:29: Back to typescript. #read-typescript-handbook #timelog:00:28:41
16:00: Pairing with Dan
18:33: Done pairing. Dinner
2024-09-27 Friday, Sep 27th
10:30: Late start. Updating my website a little bit.
11:04: Thinking about flashcards... #flashcard-mvp #timelog:00:15:00
There's a lot of information I need to ingest, and I am starting to think having a system in place is going to be necessary.
I already have a data format figured out. It's built into my dagzet system. I'll need to get it working with the rust implementation, but that shouldn't take too long.
What I don't have figured out is a spaced repetition system: something that allows me to go through a set of flashcards and update their frequency depending on how I answer.
I imagine most of the heavy lifting will be done with SQLite. A small CLI (prototype in python?) could be used to present a set of (up to) N flashcards from a given set of namespaces, randomly selected from a weighted distribution based on priority.
Based on how I answer, priority for each flashcard gets updated.
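A rough sketch of that weighted draw against SQLite; the table and column names here are made up, not the real schema, and `random.choices` samples with replacement, so duplicate cards within one draw would still need handling:

```python
import random
import sqlite3

def draw_cards(db_path, namespace, n):
    """Pick up to n card ids from a namespace, weighted by priority."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT id, priority FROM cards WHERE namespace = ?",
        (namespace,),
    ).fetchall()
    con.close()
    if not rows:
        return []
    ids = [r[0] for r in rows]
    weights = [r[1] for r in rows]
    # Sample with priority-proportional probability.
    return random.choices(ids, weights=weights, k=min(n, len(ids)))
```

Updating priorities after an answer would then be a single `UPDATE cards SET priority = ? WHERE id = ?`.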
13:41: Grind75 #grind75 #timelog:01:41:23
16:47: 3summing research #grind75 #timelog:00:47:50
What's tough here is the acceptance criteria. It needs to be 3 unique indices and the combination itself needs to be unique. Having the nested 3-loop is cleaner because the variables are in scope. Doing it with the backtracking approach requires cramming data in the recursive function.
Dead end. Moving on.
17:17: 3sum: looking at the editorial #grind75
The loops are moved around and hidden, but there still seems to be 3 distinct loops. The thing that's interesting (ish) here is that they are treating the problem as an extension of twosum, and re-using that logic, calling a version of twosum inside of 3sum.
17:25: wait wikipedia has a whole page on 3sum #grind75
I like their algorithm, which looks like the two-pointer approach in the editorial.
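The sort-then-two-pointer 3sum, roughly as on that Wikipedia page, sketched; sorting plus skipping equal neighbors is what keeps both the indices and the triplets unique:

```python
def three_sum(nums):
    # Sort, fix the first element, then sweep two pointers inward.
    nums = sorted(nums)
    results = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue  # skip duplicate first elements
        lo, hi = i + 1, len(nums) - 1
        while lo < hi:
            total = nums[i] + nums[lo] + nums[hi]
            if total < 0:
                lo += 1
            elif total > 0:
                hi -= 1
            else:
                results.append([nums[i], nums[lo], nums[hi]])
                # Skip duplicates on both sides of the match.
                while lo < hi and nums[lo] == nums[lo + 1]:
                    lo += 1
                while lo < hi and nums[hi] == nums[hi - 1]:
                    hi -= 1
                lo += 1
                hi -= 1
    return results
```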
2024-09-26
09:33: Grind75 #grind75 #timelog:01:25:41
Tasking this out now, as well as timing it.
542, 973
11:12: Going to try to implement quickselect in python
12:00: Off to lunch
13:14: typescript handbook #read-typescript-handbook #timelog:01:13:05
Reading: "Everyday Types"
14:32: JS trivia reading #read-js-trivia #timelog:00:41:10
This is a little meandering because the trivia has a ton of things I have no clue about and need to read about elsewhere.
There seems to be a lot of "coding for coding's sake" in JS. I'm reading about IIFEs on Wikipedia, and they seem to be one of those engineered constructs attempting to overcome a weak language people are forced to write stuff in.
15:32: Frontend Mentor stuff #frontend-mentor #timelog:01:02:52
17:07: DB internals reading #reading-dbint #timelog:00:45:24
2024-09-25
10:28: Beginning Grind75
125, 242, 704, 733
11:31: I need to filter out easy problems
For the most part, they don't feel helpful enough. Turns out there's a button to filter these.
12:18: Frontend Mentor Diving
Going through the "Getting Started" bits on Frontend Mentor.
13:28: First challenge sort of done?
13:37: Back to Grind75
53, 57
14:56: How to run small chunks of typescript code?
This is the only way I'm going to be able to grok typescript.
Nevermind. Just typing code is good enough.
15:27: Currently just typing in typescript code
15:35: "typescript in 5 minutes" finished (in more than 5 minutes)
Now, onto reading bits of the typescript handbook.
15:56: Finished reading "the basics" from typescript handbook
16:16: Front End Interview Handbook time?
I've already peeked at the intro pages. Now, to begin the initial dive.
Starting with: "Trivia Questions"
uh oh. I don't know anything about javascript.
17:01: ouch my head hurts
Losing energy and focus. May Need another break.
17:18: Attempting to read more about database internals
I'm trying to do less notetaking, get the reading to be faster.
17:52: That's enough reading. I'm tired.
I managed to get about 30min in. And I finished up what ended up being a very academic chapter. I am now on "Log Structured Storage", which is familiar territory since I read about it in DDIA.
2024-09-24
16:51: Grind75 has literature to go with it. Reading some of that.
Link https://github.com/yangshun/tech-interview-handbook.
Dagzet ref: <<prep/tech_interview_handbook>>.
Found <<prep/frontend_interview_handbook>> as well.
17:02: I've done a fair bit of LC questions from LC75, strategize next steps
I don't know how much time I'm going to give myself. I do feel a bit more confident about the leetcode stuff.
Now that I'm a bit more settled, I'm trying to get the hours in. It's tough being in a new place and settling in.
I think what I need to do is be particular about which problems I work on, and why. Will follow up on this at some point.
17:04: Reading bits of the frontend interview handbook
I think I'm going to be going into more fullstack jobs, so this seems like something to read. I am going to read the table of contents and try to get a sense for how long this book actually is.
17:09: Frontend Interview Handbook thoughts
There's a lot I don't know, but these feel like attainable unknowns.
17:22: Going to just manually write some notes on jobs
17:38: Walrus operator? Walrus Operator.
See: <<python/walrus_operator>>.
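A quick walrus example, binding a regex match inside the very condition that tests it:

```python
import re

def first_number(text):
    # The walrus operator (:=) assigns m while the if tests it, so the
    # match object is available in the body without a separate line.
    if (m := re.search(r"\d+", text)):
        return int(m.group())
    return None
```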
18:02: Well, the numbers are in.
I went to the jobs search page, filtered by "new york city", and hand-typed the tags for all the jobs that showed up. I then wrote a python program to parse the data and give me count aggregates. JavaScript and TypeScript tied with 21 instances. The runner-up was React, followed by Python.
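The aggregate count itself is essentially a one-liner with Counter; a sketch with made-up data, not the actual job list:

```python
from collections import Counter

def tag_counts(job_tags):
    # job_tags: one list of tags per job posting.
    # Returns (tag, count) pairs, most frequent first.
    counts = Counter(tag for tags in job_tags for tag in tags)
    return counts.most_common()
```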
18:11: The high-level conclusions.
Flexibility in employability is going to involve a fair bit of front-end experience. My guess is that this is due to a number of these being startups that need a shiny-looking product.
I think I need to pursue frontend. Learning typescript is a safe bet. Learning javascript is a safe bet. React is still a reasonable enough thing to be learning.
I think my full-stack learning path is going to be top-down. Starting with making pretty things in the browser, then gradually getting those pretty things hooked up to backend things. It's the most distant from my skillset right now.
18:21: The plan
Resources to look into first: <<prep/frontend_mentor>> as well as <<prep/frontend_interview_handbook>>. I want to get comfortable enough building conventional industry standard things.
I'm trying to see the forest for the trees. What exactly are we doing here? I don't need to "learn to code". I've been doing that. I need to learn to think differently about software. In my work, I touch computers differently than how web developers touch computers. So yes, I'm learning "tools" instead of "code" right now, but only to the point that I can understand how people approach computers.
Frontend UI stuff seems to have a language and grammar to it. They are the building blocks that go into building what we see as a "product" for "users". Tools in this ecosystem are ultimately built in the name of a commercial "product". Learning how to go along the grain of these tools will help me better form project ideas that are distinct as well as educational.
For example: A software synthesizer, in many ways, goes against the grain of a typical web app. It has an interactive visual vocabulary that was developed independently of the web. When you try to build a synth for the web the way you build a VST, you get something that feels out of place. My Trio demo feels out of place. It was built based on my experience designing for tablets and mobile phones. The only "webby" thing it has going for it is that it has a URL. The rest of it blindly copies conventions from a different medium.
I have my qualms about web audio, but sound design and music composition are areas of distinction for me. I should probably use some of those chops for another web-based project. This next web project needs to go with the grain of the web zeitgeist. It'll have a "frontend", not just a canvas. It will feel like a "product". It could have "users". There will be sound, yes. But that sound will be driven by some kind of data.
2024-09-03 Nighthub
19:27: Back to this IT parser? #itparser-rust
Do I still remember any Rust? I've got leetcode brainrot.
I never really finished this parser, so now I'm going to put some more effort into it.
19:30: So many magic constants. yay. #itparser-rust
19:32: we are going to do it right now. #itparser-rust
going to load things into a struct, create incremental unit tests. Maybe just one test is all that is needed right now.
19:56: Well, now I am taking some steps back. #itparser-rust
Looking into burntsushi regex library in order to figure out how they set up tests.
20:08: Ugh. What a rabbit hole #itparser-rust
I feel like I temporarily avoided this problem before. Now, I'm going to address it slightly more head-on.
21:03: More cargo fighting #itparser-rust
21:07: We got a test working in main. good enough #itparser-rust
2024-09-02 Bookkeeping
15:04: bookkeeping
2024-09-01
11:00: Leetcode #LC75 #timelog:00:51:59
13:03: Leetcode #LC75 #timelog:00:38:23
2024-08-31
11:33: Leetcode #LC75 #timelog:00:27:28
13:28: Leetcode #LC75 #timelog:01:41:14
19:09: Leetcode #LC75 #timelog:01:10:58
2024-08-30 Bookkeeping
08:58: Bookkeeping
This leetcode crap is taking so much longer than I expected. Ugh. I spend the most time working on problems related to data structures and algorithms I never actually looked at. The most effective strategy is to put the time into doing some research and studying sample problems before attempting the leetcode problems.
09:27: Leetcode #LC75 #timelog:01:52:19
359, transcribed Skiena string compare to Python
13:05: Leetcode #LC75 #timelog:02:20:29
1137, 746
16:23: Leetcode #LC75 #timelog:00:54:56
790
20:20: Reading: B-tree variants #reading-dbint #timelog:00:20:22
2024-08-29 Leetcode Day, Dynamic Programming Research
09:38: Leetcode #LC75 #timelog:01:30:15
875, Reading about backtracking.
12:35: Leetcode #LC75 #timelog:01:08:15
14:06: Dynamic Programming Research #timelog:01:15:26
Reading Skiena on Dynamic Programming.
19:05: Reading #reading-dbint #timelog:00:39:07
2024-08-28 Leetcode
09:56: Leetcode #LC75 #timelog:01:34:19
374, 2300
13:20: Leetcode #LC75 #timelog:01:11:00
2300, 162, 875
19:47: Leetcode #LC75 #timelog:01:14:35
2024-08-27 Leetcode
10:06: Leetcode #LC75 #timelog:01:24:26
2542 (continued), 2462,
16:06: Leetcode #LC75 #timelog:00:26:35
19:40: Reading #reading-dbint #timelog:00:41:02
2024-08-26 Leetcode Day
08:59: Leetcode #LC75 #timelog:01:03:03
215, 2336, 2542
12:58: Leetcode #LC75 #timelog:01:28:42
2542
19:15: Reading #reading-dbint #timelog:00:40:43
2024-08-25 Research: Priority Queue / Heap
12:55: Priority Heap Queue Research/Study #priority-queue-heap-research #timelog:01:16:09
2024-08-23 Bookkeeping
08:30: Leetcode #LC75 #timelog:01:59:43
399
13:27: Leetcode #LC75 #timelog:01:49:02
399, 1926, 994
15:31: Bookkeeping.
2024-08-22 Leetcode
09:19: Leetcode #LC75 #timelog:01:10:45
547, 1466
14:01: Leetcode #LC75 #timelog:00:41:17
1466
18:39: Reading #reading-dbint #timelog:01:00:39
2024-08-21 Leetcode, adaptive filters
08:59: Leetcode. #LC75 #timelog:01:58:20
450, 841
13:57: Adaptive filtering #adaptive-filtering-research #timelog:01:36:18
19:55: Reading #reading-dbint
DBINT
2024-08-20 Another bookkeeping day.
08:57: Morning Bookkeeping
Adding in time logs up to this point.
08:58: Check-in #LC75
At this point, I have attempted about half the problems of LC75. I think I've chewed through most of the easy problems that I could do quickly. What remains are the sorts of problems I'll need to do more research into. I'm ready to slow down with these and change gears a little bit.
19:22: Small bit of reading #reading-dbint #timelog:00:11:09
2024-08-19 Leetcode Day.
09:47: Leetcode. #LC75 #timelog:01:21:30
1372, 236
13:36: Leetcode. #LC75 #timelog:01:01:33
199
16:06: Leetcode #LC75 #timelog:00:37:16
1161, 700
19:45: End of chapter 4, beginning chapter 5: Transaction Processing and Recovery #reading-dbint #timelog:00:58:42
2024-08-18 Leetcode Day.
10:22: Leetcode. #LC75 #timelog:00:23:33
437 (partial)
12:00: Leetcode. #LC75 #timelog:00:40:37
437 (continued)
15:20: Leetcode #LC75 #timelog:01:17:53
Even more 437.
19:10: Started Chapter 4: Implementing B-Trees #reading-dbint #timelog:00:41:35
2024-08-17 Redblack tree day
10:11: trees #redblack-trees #timelog:01:01:21
13:30: even more trees. #redblack-trees #timelog:01:20:35
15:41: AVL trees (Knuth) #redblack-trees #timelog:00:45:12
18:52: DB Reading #reading-dbint #timelog:00:32:20
cell layout, slotted pages, checksumming, managing variable-sized data
2024-08-16 Interrupting the Circle
Years ago, I helped a friend of mine with their art installation. One of the things I helped out with was assembling a wall of glass shards from handblown orbs. The instructions given to me were: create a circle of glue, interrupt the circle, and place the shard.
Interrupt the circle. That was what she said. Where did she come up with that? The phrase left an impression on me, and I still think about it.
I think a large chunk of my adult life can be described in terms of circle-building and circle-interrupting. I show up somewhere, build a small routine that I follow religiously, then intentionally break that routine. The circle I made during RC was satisfying to build, its edges were perfectly round, the grooves etched in deep. Now, that beautiful circle is ready for interruption.
I've been trying to figure out how to continue to log things after my batch, or even if I should log things. I'm slowly transitioning to "job-hunt" mode. Is it even worth talking about? I'm doing very boring, very tedious things, starting with a leetcode speedrun.
I'm still logging time for long-term things, and that still seems worthwhile.
09:04: Logging things. #LC75
13:30: Read: An Algorithm For Passing Interviews. #read-algorithm-interviews #timelog:00:13:29
Summary: the "Algorithm": there are usually around 8 kinds of things you can use to solve a given technical problem: Hash Tables, Linked Lists, Binary Trees, Depth First Search, Sorting, Dynamic Programming, Recursion, and the like. By asking about the runtime and seeing what the problem is asking for, things can usually be reduced to about 2-3 options.
13:48: Read section on Trees and Graphs #read-cracking-coding-interview #timelog:00:07:34
This was a pretty short overview. Going to need to find more books/resources on the subject of Trees and Graph traversal. Off to The Algorithm Design Manual.
14:04: Trees and Graphs in TADM #read-TADM #timelog:00:55:41
15:10: Leetcode #LC75 #timelog:01:30:13
104, 872, 1448 (partially, going to revisit soon)
19:38: Leetcode #LC75 #timelog:00:23:14
1448. I gave up and looked it up. I was headed in the right direction with the recursive answer. C++ classes in the solution made it much easier: count was essentially kept as a global. When I didn't have to think about keeping track of that variable in a recursive function, the problem and the code got a lot easier to reason about. Lesson learned.
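For my own notes, here is a sketch of the lesson from LC 1448 ("Count Good Nodes in Binary Tree") the other way around: thread the running max down as an argument and return counts up, so no shared counter variable is needed at all.

```typescript
// Count "good" nodes: nodes with no larger value on the path above them.
// The running maximum goes down as a parameter; counts come back up as
// return values, so there is no global/member counter to track.
interface TreeNode {
  val: number;
  left?: TreeNode;
  right?: TreeNode;
}

function goodNodes(node: TreeNode | undefined, maxSoFar = -Infinity): number {
  if (!node) return 0;
  const isGood = node.val >= maxSoFar ? 1 : 0;
  const max = Math.max(maxSoFar, node.val);
  return isGood + goodNodes(node.left, max) + goodNodes(node.right, max);
}
```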
20:15: Started chapter 3: file formats. #reading-dbint #timelog:00:38:16
2024-08-15 Leetcode Day.
09:17: Leetcode. #LC75 #timelog:01:25:07
2095, 328
13:15: Leetcode. #LC75 #timelog:02:10:46
206, 2130
18:20: Reading. #reading-dbint #timelog:00:34:33
2024-08-14 Leetcode Day.
09:24: Leetcode. #LC75 #timelog:01:27:06
1657, 2352, 2390, 735, 394
15:33: Leetcode. #LC75 #timelog:01:01:32
933, 649
2024-08-13 Leetcode Day.
09:26: Leetcode #LC75 #timelog:00:27:55
643.
10:00: Read up on Prefix Sum. #LC75 #timelog:00:18:09
12:14: Leetcode. #LC75 #timelog:01:58:57
1456, 1004.
14:22: Leetcode. #LC75 #timelog:00:58:38
1493, 1732, 724, 2215.
18:58: Reading. #reading-dbint #timelog:00:24:28
2024-08-12 Leetcode Day.
08:50: Leetcode #LC75 #timelog:02:08:51
1768, 1071, 1431, 605, 345, 151, 238
12:51: leetcode #LC75 #timelog:02:11:21
334, 443, 283, 392, 11, 1679
19:36: B Tree Basics #reading-dbint #timelog:00:32:57
2024-08-09 Day 82. I RECURSER PAUL BATCHELOR.
Last day of batch.
I am overwhelmed. I don't have all the words yet, but I think I have a few. I think I might work on my reflection/retrospective. I've been really into the writing style "I MISS LORINA BULWER", so I think I might try to do something like that. Something along the lines of "I RECURSER PAUL BATCHELOR".
Reflections: retrospective.
08:37: Morning Triage.
08:38: Generate a page using basic HTML and a list #dagzet-ink
And then it'll be done.
08:41: Sunsetting. I got far enough on this. #itparser-rust
It's a good proof of concept for a proof of concept. I can return to it later, if I need it.
08:44: re-reading lorina bulwer. #retrospective
See: <<misc/i_miss_lorina_bulwer>>.
On second thought. This might be a little too intense for the RC crowd. But, I might give it a go anyways.
08:53: Writing some things down. #retrospective #timelog:00:45:15
I might turn this into tamer prose. We'll see.
09:35: Publishing?
2024-08-08 Day 81. Never Graduate.
1 day left of batch.
09:22: Morning Triage.
09:26: Probably not going to finish this. #blob-coax
But I have a working demo that I think I can post today.
09:27: I've read enough of this book #read-elem-compsys
I think it'd be a good exercise at some point to go back and try working through the exercises. But, for now, I think I'm at a decent enough stopping point.
09:30: What's a good stopping point for this #itparser-rust
I think it would be nice to work this out in such a way that it exports JSON data for the patterns. Probably just low-level row information for now (Serde?). Later, that data can be analyzed by other programs to do things like extract notes and durations and so on.
09:50: Publishing.
10:00: Piano-playing and chats.
I'm trying to pick out some of "Scavenger's Reign", a show I just started watching. I really like this show!
10:51: start curating some ink #curate-ink #timelog:00:15:00
11:00: Lunch
13:30: More ink curation #curate-ink #timelog:00:52:26
14:23: Curated and uploaded. #curate-ink
It's too bad there isn't a good way to display this right now. I'll have to get an ink dagzet working at some point. Maybe.
15:00: Back home. Watched TV.
16:00: Presentations. Never Graduate Ceremony.
17:00: Party
20:30: Go Home.
2024-08-07 Day 80. Some reading.
2 days left of batch.
Feeling tired and groggy. Just like the weather.
08:51: Arrival. Triage.
09:00: Publishing.
09:11: Chapter 6: Assembler #read-elem-compsys #timelog:00:54:55
I read the chapter in one sitting. I imagine it's much easier to read about implementing an assembler than it is to actually implement one. At some point, I'll try to find time to actually work this all out.
Teach Yourself Computer Science (TYCS) recommends reading the first six chapters of this book. I have done this. I think this is good enough.
2024-08-06 Day 79. ALSA and CPAL troubleshooting.
3 days left of batch.
I'm feeling pretty raw today. I've been processing a lot of feelings. You know how when you lose sensation in your hand, and then the feeling comes back? Those initial pins and needles hurt, but you know it's a sign of progress and it'll pass. It's kind of like that. Things just hurt, but it feels like progress. I don't think I was feeling much before this.
08:32: Morning Triage.
08:43: Publish
08:47: Niceties
10:26: I think we're done with niceties.
Might go back and check at some point.
11:00: Lunch
12:00: Audio Hang
13:00: Troubleshooting Phone with PK
15:50: Snack
16:04: More stabs at the parser #itparser-rust #timelog:00:50:13
Got far enough to do some initial parsing of pattern data. Good enough for me for now.
2024-08-05 Day 78. Initial IT parser attempt.
4 days left of batch.
Friday will be the 0th day, because there are no days left after that.
08:38: Morning triage.
09:15: Want to see how hard this is to do #itparser-rust
I think I know enough Rust where the "it's in Rust" part isn't all that challenging anymore. There is one definitive spec file for IT.
09:26: Publish.
09:33: taking a look at ITTECH.TXT #itparser-rust #timelog:01:08:43
09:48: browsing IT files #itparser-rust
10:25: Found an interesting IT file. adding. #itparser-rust
10:29: Baby's first IT file. #itparser-rust
Just some sine pulses.
11:04: Okay. that was kinda fun. #itparser-rust
14:52: Attempts at an IT parser #itparser-rust #timelog:00:12:08
Didn't get far at all.
19:37: Another attempt? Another attempt. #itparser-rust #timelog:00:27:02
2024-08-04 Day 77. investigating ft2play
5 days left of batch.
10:30: Late Morning Triage.
10:59: Publishing.
11:03: Investigate ftplay again. #investigate-ft2play
Looked into this a while back. Going to give it another look today. <<trackers/ft2play>>.
11:12: Let's look into ft2play #investigate-ft2play #timelog:00:42:46
I want to see if I can get this working on my machine without any modifications. From there, my hope is to try to remove the SDL dependencies. Then, I'll re-assess the codebase and see if there's anything else I can do to simplify it.
11:18: Builds and runs! #investigate-ft2play
11:24: Studying the entry point. Isolate the wav renderer. #investigate-ft2play
I've also created a test file that renders and compares outputs.
11:31: I am making a Makefile. #investigate-ft2play
11:34: There's a README on how to make drivers #investigate-ft2play
It might be quite easy actually to rip out SDL.
11:37: makefile works. #investigate-ft2play
11:39: Working on a dummy driver #investigate-ft2play
11:40: Actually, before that, how does the wav rendering work? #investigate-ft2play
11:44: WAVDump_Record is the entry point #investigate-ft2play
The thread is now gone. The code comments actually anticipated people like me.
11:47: Let's see if I can get this dummy driver working quickly #investigate-ft2play
The dummy driver was surprisingly straightforward. It really seems unnecessary since we're just using the WAV renderer. The SDL2 deps are gone, it seems. I've also flattened the directory a bit.
Much smoother than I expected, really. Next up, I'd love to rework the command line so it's just something like "ft2play in.xm out.wav". From there, import it into mnolth?
14:44: Improve the CLI of ft2play #investigate-ft2play #timelog:00:24:03
15:02: simpler main created, now to remove the unused code #investigate-ft2play
15:09: Let's turn on some warnings. #investigate-ft2play
I want to make the output reasonably clean before inserting it into mnolth.
Hey, not too bad these warnings. Not too bad, 8bitbubsy.
Actually, all those warnings introduced were mine. Oops. Very well done, 8bitbubsy. You are giving me a very enjoyable experience here.
15:17: Is it ANSI-compliant (AKA --std=c89)? #investigate-ft2play #timelog:01:24:21
Well, no, it isn't. But nothing ever truly is.
15:22: Rolling up my sleeves #investigate-ft2play
Once I got past the C++ comments, the real stuff started to appear. Usual suspects.
15:53: pmp_main builds with --std=c89 now. #investigate-ft2play
16:02: trying to automate these comments a little bit. #investigate-ft2play
16:03: actually, ow, no nevermind. #investigate-ft2play
16:04: you know what. this ansification isn't worth it. I should stop. #investigate-ft2play
This is a waste of my time. It doesn't actually do much for me for portability.
16:06: dragging this into mnolth #investigate-ft2play
16:45: imported to mnolth. #investigate-ft2play
16:50: Add some colors #blob-coax #timelog:00:17:00
18:52: Finish chapter 5. #read-elem-compsys #timelog:00:30:22
2024-08-03 Day 76. c2rust, XM Players in Rust, More Coax
6 days left of batch.
10:40: Late morning triage.
11:05: Publishing
11:10: Taking a fresh look at this thing. #blob-coax #timelog:00:13:04
It really should aim to choose the closest pellet.
The only sound I have in mind right now is the sound of placing pellets. One of those bubble-popping sounds. Like the sounds on the Switch.
The "consume and become larger" mechanic was an interesting ad-hoc idea. But it's Katamari Damacy. I should look into that game. They have a crazy soundtrack from what I remember.
I want to play with better colors here. Manic Japanese pastel colors. Think Kirby and Mario.
I think this game soundtrack should feel very tactile.
11:21: Doing some code cleanup. #blob-coax
11:22: I still want cheezy bossa nova somehow. #blob-coax
Maybe try to get this working in a tracker?
11:23: compiling c2rust #investigate-c2rust
Looks like you can just do "cargo install c2rust", which is pretty awesome if it just works.
LLVM issues. Installed llvm-dev, trying again
another error: libLLVMDemangle.a does not exist
11:30: Ugh. maybe a dead end. #investigate-c2rust
11:34: oooh. there's an llvm17-static #investigate-c2rust
Let's see if this works.
Restarting, then trying again. The library does seem to be there.
11:42: libLLVMTestingAnnotations.a missing now
I found an arch page that has it in LLVM18. I installed LLVM17. Let's see if that does anything.
11:52: Same deal. Giving up. #investigate-c2rust
The most promising command so far:
13:51: I don't need c2rust afterall! #investigate-c2rust
Someone already made it. See <<rust/crates/xmrsplayer>>.
13:55: Testing xmrsplayer #investigate-xmrsplayer
See: <<rust/crates/xmrsplayer>>.
Well, it builds, but once again I run into the usual snags with CPAL and ALSA.
People, add a fallback offline wav writer. Jeez.
14:03: There's a "bufferedsource" example #investigate-xmrsplayer
Wonder how hard it would be to work that into my monwav voxbox writer.
14:07: Create a new xmplayer project in my scratch #investigate-xmrsplayer #timelog:01:07:06
Did not know you could do the thing with "cargo add xmrsplayer". That was so easy.
But ah yes, now the infamous build time.
14:20: sidetracked: nectarine in mplayer?
It works!
14:27: back to it now. #investigate-xmrsplayer
14:58: Some sound, but it's very faint. #investigate-xmrsplayer
15:05: Clicks. Clicks in the output. #investigate-xmrsplayer
I am not getting a warm feeling about this one.
15:07: Trying another XM file (acid-jazzed evening) #investigate-xmrsplayer
No clicks, but this one is out of tune. Also very soft gain.
15:11: Nevermind. #investigate-xmrsplayer
Not worth the time investment here.
15:36: Playing my thingy while listening to bossa-nova music #blob-coax #timelog:00:20:05
wii music: good vibe. something about the timing
getz/gilberto girl from ipanema: there's something more playful about this pairing. I think it's cuz there's a bit more of a film cliché happening.
Severance OST, Theodore Shapiro, "Labor of Love": The triangle makes this very funny. It's definitely a matter of tempo.
wii shop: ~170 BPM, ipanema: ~140 BPM, severance: ~125 BPM.
Desafinado (Gilberto/Jobim/Getz) ~140 BPM. This one is also a very playful juxtaposition.
15:56: rework blob to go over closest #blob-coax #timelog:00:08:38
16:41: Got sidetracked trying to build Bun on alpine. #blob-coax
Didn't actually get anything done. Heading out.
19:37: Back to trying to get "eat closest" working #blob-coax #timelog:00:33:01
Had a nice high-level chat with RB about a possible implementation. He called it an "object pool", which is pretty much exactly what I'm going after.
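A minimal sketch of the "object pool" idea for eat-closest, assuming hypothetical names (pellets are pre-allocated and just flagged active/inactive, and the blob scans for the nearest active one):

```typescript
// Fixed-size pellet pool: nothing is allocated or freed mid-game,
// slots are just toggled active. Eating a pellet returns its slot
// to the pool for reuse.
interface Pellet {
  x: number;
  y: number;
  active: boolean;
}

function makePool(size: number): Pellet[] {
  return Array.from({ length: size }, () => ({ x: 0, y: 0, active: false }));
}

function spawnPellet(pool: Pellet[], x: number, y: number): boolean {
  const slot = pool.find((p) => !p.active);
  if (!slot) return false; // pool exhausted
  slot.x = x;
  slot.y = y;
  slot.active = true;
  return true;
}

function closestPellet(pool: Pellet[], bx: number, by: number): Pellet | undefined {
  let best: Pellet | undefined;
  let bestDist = Infinity;
  for (const p of pool) {
    if (!p.active) continue;
    const d = (p.x - bx) ** 2 + (p.y - by) ** 2; // squared distance is enough
    if (d < bestDist) {
      bestDist = d;
      best = p;
    }
  }
  return best;
}
```

Eating is then just `closestPellet(...).active = false`, which frees the slot for the next click.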
2024-08-02 Day 75. Silly idea. Project name: "Coax"
7 days left of batch.
08:30: Play piano
09:00: Chats at Table
09:27: Catching up on logs
09:34: I want to try a mini bucket named blob #demo-blob #blob-coax
So, poke is really bumming me out. It's too much abuse.
Using no sound, just circles, to try out this initial idea.
There is a big white circle on the page. You click to place other smaller black circles (pellets). The circle waits, then moves to each circle to "eat" them, then repeats the process until there aren't any other circles.
10:09: retrospection #demo-poke
Yesterday I showed another person my Poke demo [0]. Another very visceral response: "why did you make me do that to him?"
On the one hand, I guess I should be pleased that I am getting strong reactions to my art. On the other hand, no, not like this :(
I find myself agreeing with the reactions. It's very hard to make the "poke" action feel consensual. Even a "tickle" is an invasion. I wanted to build something whimsical, but I didn't really consider the implications of the mechanic. No matter how cute or whimsical the design is, the poke negates all of this.
I feel a little more deflated than usual about this. With "Poke", it sometimes feels like I've accidentally added a bit more cruelty to the world, which is the exact opposite of my intention.
11:05: Finally publishing.
12:45: Inking out some ideas. #blob-coax #timelog:00:22:24
13:19: Attempts to code up the ideas in Canvas #blob-coax #timelog:02:27:57
13:35: We have a circle. #blob-coax
But, weird stroke bug in firefox. The line has variable thickness, and some dots. In chromium, this is a non-issue.
14:10: Trying to get an initial journey mechanic working #blob-coax
14:16: trying to get a constant velocity now #blob-coax
Somehow need to convert times to be pixels per second.
14:30: Inking some ideas #blob-coax
14:44: Attempting to implement those ideas #blob-coax
14:55: I might not need trig actually #blob-coax
It's traveling on a line, so maybe just the line equation. It can be a normalized slope velocity too. Something to add to a point until it reaches a target.
15:01: I'd like to add different movement behaviors #blob-coax
So, calculate progress, and then use an easing function to make it so it can slow down or speed up to a target.
15:13: behavior... kind of? #blob-coax
It's not quite what I wanted it to be. I'd need to rework it to make it work with easings. The current implementation just calculates a relative trajectory, and I'm adjusting how fast that trajectory goes based on the position. Interpolating between two points would be better.
15:46: I got behaviors the way I want now #blob-coax
Behaviors now have different interpolation schemes. They are also randomized now, and can have randomized speeds.
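The interpolation scheme above can be sketched like this (a minimal version, with hypothetical names): compute progress in [0, 1] from elapsed time, shape it with an easing function, then lerp between start and target.

```typescript
// Eased interpolation between two points. Swapping the easing function
// swaps the movement "behavior" without touching the rest of the code.
type Easing = (t: number) => number;

const linear: Easing = (t) => t;
const easeInOut: Easing = (t) => t * t * (3 - 2 * t); // smoothstep

function moveAlong(
  start: [number, number],
  target: [number, number],
  elapsed: number,
  duration: number,
  ease: Easing
): [number, number] {
  const t = Math.min(elapsed / duration, 1); // clamp progress at the target
  const e = ease(t);
  return [
    start[0] + (target[0] - start[0]) * e,
    start[1] + (target[1] - start[1]) * e,
  ];
}
```

Randomized speeds then just mean randomizing `duration` per journey.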
18:18: Insert more pellets, eat pellets #blob-coax #timelog:00:36:16
18:47: wait timer, get larger #blob-coax #timelog:00:53:44
18:56: Blob should get bigger every time there are pellets #blob-coax
When they get too big, there should be a forced pause, and the Blob should shrink again.
19:23: If pellets aren't around for a while, digest #blob-coax
19:30: It should begin jiggling at the larger scales #blob-coax
Like it's about to burst.
2024-08-01 Day 74. Presentation on Trio
8 days left of batch.
Demo: http://pbat.ch/recurse/demos/trio.
Yeah, this was a bit of a log-free day. Doing the presentation was surprisingly draining. I swear, I used to be better at this.
09:00: Slides and Prep
11:34: Lunch
12:51: Resume Stuff
16:00: Presentations
2024-07-31 Day 73. Uploaded Trio to the Web. Balloon-ing in poke.
9 days left of batch.
I stayed up too late again. I am tired. I did try sleeping in a little bit though. Going to have to pace myself.
I have just signed up for a presentation slot. So, I gotta get enough working for that today.
08:34: Morning Triage
09:00: Hoping to improve chord selection today #demo-trio-chord-improvements
It chooses chords too quickly (when the lead changes), and should only think about changing chords when the lower/upper notes happen. I had a concept of "committing" to a chord. A faster thing might be to just move the chord changes to when the lower note triggers rather than the lead.
Also, I want tonic bias. Every 4 non-tonic chords or so, there should be a switch that prioritizes the tonic, if it is available as a candidate.
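The tonic-bias rule could be sketched roughly like this (all names hypothetical; `pick` stands in for whatever the normal chord-selection logic is):

```typescript
// Tonic bias: after 4 non-tonic chords in a row, snap to the tonic
// whenever it appears among the candidates; otherwise defer to the
// usual selection logic.
const TONIC = 0; // hypothetical chord id for the tonic

class TonicBias {
  private nonTonicRun = 0;

  choose(candidates: number[], pick: (c: number[]) => number): number {
    let chord: number;
    if (this.nonTonicRun >= 4 && candidates.includes(TONIC)) {
      chord = TONIC;
    } else {
      chord = pick(candidates);
    }
    this.nonTonicRun = chord === TONIC ? 0 : this.nonTonicRun + 1;
    return chord;
  }
}
```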
09:25: Publish
09:27: Attempts to get better chords working. #demo-trio-chord-improvements
09:48: The quick fix makes things sound MUCH better now. #demo-trio-chord-improvements #timelog:00:18:34
I've just been noodling on this thing for the past 15 minutes.
09:52: Tweaked the tract lengths too #demo-trio
09:55: revisit mobile, add smoothing, and other things. publish #demo-trio #timelog:01:15:28
10:02: shoot I am having too much fun with this thing it's working it's really working #demo-trio
That's how I know this is close.
10:04: Okay for real, smoothing. #demo-trio
10:18: My phone is almost definitely running at a higher sampling rate #demo-trio
I want to see if I can get the webaudio API to request a sampling rate (44.1kHz) because the voices are just going to sound better.
I am now coming to understand why gnuspeech decided to go the route of resampling from 16kHz to the host sampling rate. It's just going to sound better that way. The changes sound too weird.
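For reference, requesting a context rate is a constructor option in the Web Audio API. This is just a configuration snippet; whether the request is honored at the hardware level is up to the browser, which resamples internally if needed:

```typescript
// Ask for a 44.1kHz context; the browser resamples to the hardware
// rate under the hood if the device runs at something else (e.g. 48kHz).
const ctx = new AudioContext({ sampleRate: 44100 });
console.log(ctx.sampleRate);
```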
10:46: Getting caught up in tweaking the "ooo" vocal shape #demo-trio
I'm finding it's very hard to dial in from my interface, because the sliders end up being very close to zero. I almost feel the urge to have a min/max slider too, to be able to "zoom in".
11:04: Requested sample rate of 44.1kHz. Sounds better now #demo-trio
11:08: Time to upload this thing. #demo-trio
11:14: Uploaded. #demo-trio
12:55: Safari troubleshooting. #demo-trio #timelog:00:11:13
13:30: Talking about chords and practice apps with RN
14:31: Balloon! Initial port attempt. #voxbox-balloon #timelog:00:51:52
14:42: I'm confused. I thought balloon took an input. #voxbox-balloon
Oh wait, I made pressure the input, which you would vary with an envelope. That's amazing. Good idea, past me.
15:01: Test out this balloon. In chatter example? #voxbox-balloon
15:25: Attempts to hook up balloon to poke #poke-evolving-sounds #timelog:00:32:52
15:42: Our little guy keeps rising, but not falling. Balloon issue? #poke-evolving-sound
15:59: Okay some evolving sounds now. #poke-evolving-sound
16:28: listening to guinea pig noises on freesound for inspiration #poke-laughter-chitter
16:49: I'm not hearing much of a difference for poke #poke-evolving-sound
I think it's time to move on.
It takes way too much time to try out sounds in the webaudio workflow I have. Having to recompile the wasm, go to the web browser, refresh the page, and maybe have it not reload (because browsers can sometimes be weird about WASM stuff specifically) was a pain.
If I were to do this again, I'd build a native version before getting it working on the web.
16:57: Picking up where I left off. #resume-follow-up #timelog:00:51:09
I am putting it in google docs and fitting it manually into the resume template. it is slow.
2024-07-30 Day 72. Trio Initial Web Port made
10 days left of batch.
Prev: attempts to port to web, some more chord manager ideas.
Next: demo-trio-web: Get an initial prototype working, basically.
07:50: Morning Triage
08:01: Just do the graphics in canvas instead of p5? #demo-trio-web
R mentioned this, and it's been a sentiment that others have mentioned as well. p5 just makes canvas calls anyways.
08:06: Publish.
08:07: Get something working using canvas API #demo-trio-web #timelog:01:34:07
08:56: Trying to get pointer logic working. #demo-trio-web
Following <<webdev/using_pointer_events>>.
The issue was Chrome-mobile-related. Had to use CSS to set touch-action to none.
09:35: Setting up dynamic lines layout #demo-trio-web
09:43: Okay visuals are off to a good start now. #demo-trio-web
09:48: Begin work making audioworklet stuff #demo-trio-web #timelog:01:02:25
Once again...
09:50: turning things into a module #demo-trio-web
Interesting, the parser becomes stricter. I need to formally declare "i" in my for loop. All done by simply adding type="module" to my HTML script element.
10:13: click to start mechanics not behaving for some reason #demo-trio-web
Figured it out: it was the callback mechanics of requestAnimationFrame.
10:16: adding "TAP TO BEGIN" text. #demo-trio-web
10:27: setting up message passing to worklet #demo-trio-web
10:30: first getting the input events to the node #demo-trio-web
10:38: startAudio returns a promise I guess? #demo-trio-web
Solution: encapsulate in .then().
10:43: move message works! #demo-trio-web
10:47: send to processor via messages now #demo-trio-web
10:51: initial message passing implemented #demo-trio-web
I'll have to do the rest later.
12:00: Audio Hang
13:17: More audioworklet stuff #demo-trio-web #timelog:02:51:07
13:21: Why is gate not being read in chromium #demo-trio-web
The mouse events are happening either way.
Logic bug: an and should have been an or. Struggled to clear the cache on Chromium, too.
13:27: Get wasm stuff working #demo-trio-web
13:29: My println stuff is causing issues I think. #demo-trio-web
I'm not going to do macros, I'm just going to comment things out.
13:37: Huh, it's not actually generating a wasm at all, even though there is no error. #demo-trio-web
Had to generate cdylib.
13:43: More unknown opcodes in wasm-gc. I wonder if there's a way to get more info on that #demo-trio-web
It went from unknown opcode 192 to 193, after I commented out all the panics.
13:50: broke the tests last night with more notes I think #demo-trio-web
Nope, that wasn't it. Sigh. Gotta bisect.
Bisect tracked it down to the quick bugfix I made that checked for valid lower/upper values.
14:21: A messy fix for now, but things don't panic and tests pass #demo-trio-web
14:27: A very odd opcode was generated in wasm #demo-trio-web
Opcode 193 is a sign-extension instruction, which caused wasm-gc to panic. The lines that do that are in the measure_movement function. It was quick to find because it's the only time I deliberately cast from unsigned to signed. I happened to use i16, which caused the panic. However, changing to i32 made things work just fine.
14:31: right. Now I need to get the wasm in. #demo-trio-web
The exports and the interface are already there from the C prototype. It should mostly drop right in.
15:00: There should be sound. But there is no sound. #demo-trio-web
15:25: Still not sure why there isn't sound. #demo-trio-web
15:28: I forgot to return true, but there still ain't any sound. #demo-trio-web
15:46: Okay, now there's sound. #demo-trio-web
I wasn't passing in the dsp parameter.
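Both of today's silence bugs fit in a small sketch (names hypothetical; `render` stands in for the wasm-side call): process() has to return true to keep the worklet alive, and the dsp state has to actually be passed into the render call.

```typescript
// Stand-in for the wasm-side synth state and render call.
type Dsp = { gain: number };

function render(dsp: Dsp, out: Float32Array): void {
  out.fill(dsp.gain); // placeholder for the real wasm render
}

// Pure core of an AudioWorkletProcessor-style process() callback,
// written as a plain class so the two bugs are visible in isolation.
class TrioProcessorCore {
  constructor(private dsp: Dsp) {}

  process(outputs: Float32Array[][]): boolean {
    // Bug #2: forgetting to pass `this.dsp` here means silence.
    render(this.dsp, outputs[0][0]);
    // Bug #1: returning false (or nothing) lets the browser
    // garbage-collect the node, which also means silence.
    return true;
  }
}
```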
15:47: Attempting to run on phone. #demo-trio-web
16:11: Good enough. Taking a break. #demo-trio-web
19:30: Back to the hub. Didn't actually get anything planned done.
I did, however, manage to get jackdaw running on my computer <<misc/jackdaw>>. I was very impressed with the project, and didn't realize just how much tooling was built for it: things like a layout editor, a YAML parser, and an XML parser.
The audio devices weren't querying correctly on my computer, and we discovered that it was most likely a problem with SDL on my computer.
2024-07-29 Day 71. Week 11.
11 days left of batch.
This past weekend was very heads down. I didn't do my usual morning triage.
08:54: Morning triage.
09:40: How does this sound today? #demo-trio #timelog:00:40:00
Added a new shape. Really expressive now. Vibrato not as bad as I remembered (with headphones on now instead of speakers).
Should resolve to tonic more often. There should be a built in bias to choose tonic.
I'm desiring the instantaneous mode again.
The chord changes are happening too quickly. If I select a pitch, go down quickly, and go back up, the chord shouldn't change. But it does.
10:11: Sounding chords are different from hypothetical ones #demo-trio
Right now, a changing pitch triggers a set of chord progressions. But if they move too quickly, only the last one is heard. Which means effectively an arbitrary chord transition.
How to prevent this? A chord isn't a chord until all 3 voices commit. If the lead changes pitch, it uses the stored chord as a reference point. As it continues to change, this chord will continue to be used. The upper voice is the last to commit to a chord, so only then should the reference chord be updated.
When the chord is officially committed for selection, only then does that chord get registered in the chord frequency table.
Oh my god this is consensus in a distributed system.
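A minimal sketch of the commit idea, with hypothetical names: a proposed chord only becomes the registered reference chord once every voice has committed to it.

```typescript
// "A chord isn't a chord until all 3 voices commit": the committed
// reference chord only updates once every voice has weighed in.
type Voice = "lower" | "lead" | "upper";

class ChordCommit {
  private pending = new Set<Voice>();
  private candidate: number[] = [];
  private committed: number[] = [];

  // A new hypothetical chord is proposed; all voices must re-commit.
  propose(chord: number[]): void {
    this.candidate = chord;
    this.pending = new Set<Voice>(["lower", "lead", "upper"]);
  }

  // A voice commits; returns the current reference chord, which only
  // changes once the last pending voice has committed.
  commit(voice: Voice): number[] {
    this.pending.delete(voice);
    if (this.pending.size === 0) {
      this.committed = this.candidate; // register in the chord table here
    }
    return this.committed;
  }
}
```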
10:25: Tweaking the resume. #resume-follow-up #timelog:00:52:00
What are the changes?
Talk about Recurse Center. Looking at other resumes, should go into "education section". What did I learn about? Some topics, ordered by time investment: Rust, React, Typescript, Low Level Computer Architecture, (elements of computer systems), Gameboy Development.
More notes on Boomy: "Optimized WAV file generation using X and built method for measuring throughput". Prototypes: Describe what was built, intended functionality, tools used. It's okay if they weren't actually used.
Muvik: "Designed and implemented core audiovisual rendering pipeline in C and Lua". Also include bullet for study and link to patent.
I'm not consistent about ending bullet points with a full stop.
10:37: RC should be in experience, not education #resume-follow-up
The education section is just bullet points. Not much detail.
oh hey, it looks like I already had some details for this one.
Okay no details.
11:17: Pack up. Off to lunch.
14:30: Working out the "chord committing" idea on paper. #demo-trio #timelog:00:40:08
15:16: Considering this task done #demo-trio-chords
Even though I still want to tweak the chord selection algorithm, the core work is done.
15:17: Begin initial web port #demo-trio-web
15:20: Writing out boilerplate HTML/js #demo-trio-web #timelog:00:36:46
Doing this by hand, as a learning experience. Poke used as reference.
I am using "modules" now to help me better break up my code ahead of time.
15:42: working on some initial p5 work.
16:07: initial webaudio scaffolding #demo-trio-web #timelog:00:38:01
16:31: What's a good way to get sound/visuals to talk? #demo-trio-web
Trying to do this the mildly correct way.
I'm making a new class off of p5, adding my own data to it.
16:41: Adding initial processor code #demo-trio-web
No WASM bits yet, though there should be a placeholder.
16:45: That's all we have time for. #demo-trio-web
Wrapping up this session.
19:07: More web porting #demo-trio-web #timelog:01:51:28
19:09: I'm turning this into a typescript project. #demo-trio-web
19:17: I am fighting the code formatter #demo-trio-web
npm i gts --save-dev
npx gts init
This seems to solve the formatter
npx tsc
19:32: you know what on second thought, just keep on doing JS I'm tired #demo-trio-web
19:49: I just lost my main work. This sucks. #demo-trio-web
Accidentally deleted sketch.js.
Might as well try typescript again.
20:20: p5 does not want to play nice with typescript #demo-trio-web
20:28: great, now I'm looking at the p5 codebase #demo-trio-web
20:50: "npm test" generates the p5.min.js file great #demo-trio-web
There is literally one line that is the issue. I am going to attempt to patch this on my end.
21:08: let the record show: we tried to get p5 working. #demo-trio-web
2024-07-28 Day 70: Trio Chord Selection Heuristics.
12 days left of batch.
11:00: Implement upper/lower voice finder. #demo-trio-chords #timelog:00:09:43
11:11: Plan out how to insert chords #demo-trio-chords #timelog:00:34:42
11:41: I have built an initial "Chord Manager" #demo-trio-chords
I believe I have identified the entry points for the chord manager in the existing code, and I have built an initial dummy interface for it. There is a failing test now with the methods.
If I can implement this interface, I think it'll be ready for actual integration.
11:47: Implementing chord manager #demo-trio-chords #timelog:00:49:00
11:49: Populating the chord table. #demo-trio-chords
12:22: Upper/lower movement could be different #demo-trio-chords
Upper/lower voice selection is based on minimal distance from lead voice. This could also change to be based on the minimal distance from the voice it is currently at while also being above/below the lead.
Writing this down, I'm realizing there's potential for chaos (an upper voice that somehow keeps drifting upwards, for instance). Could be a cool thing though.
12:36: tests seem to pass for the initial chord manager #demo-trio-chords
12:37: Let's see if I can make this work #demo-trio-chords #timelog:00:17:39
12:56: I want a minor VI in there #demo-trio-chords #timelog:00:20:00
13:03: Harmonic continuity is coming through, but better voice leading needed #demo-trio-chords
Also running into a bug with chord harmonies. I get wrong chords. If I keep tapping on Do, I'll land on sustained chords, which aren't correct.
13:07: Oh right, I think I need to make sure duplicates are ignored #demo-trio-chords
Still not fixing the bug I described above. The lower note is incorrectly holding onto the note of the previous chord for some reason.
13:16: Pulling myself away. #demo-trio-chords
14:49: Fixing the weird initial chord bug #demo-trio-chords #timelog:00:07:32
Retriggering the lowest note causes it to alternate between a major chord (correct) and a sustained chord (incorrect). The upper note is correct, while the lower note is incorrect.
14:59: It never chooses anything but the top 2 right now #demo-trio-chords
15:06: Minimal movement, and better choices #demo-trio-chords #timelog:00:19:50
The state transitions aren't shuffling through the states as well as I'd like. Also, there's too much unnecessary voice movement.
How can the system choose interesting chords and also minimize movement?
Specifically: I want the system to be introduced to the minor 4 (without resorting to randomness), and I want the voices to resolve in an expected way.
It is a constraint problem. Each voice has different behavior.
Upper voice should move as little as possible while still being above the lead voice.
Lower voice should move the minimum number of steps required to complete the chord, based on what the current upper and lead pitches are.
New heuristic idea: of the candidates, choose the least used chord so far. Probably keep a running heat map. This solves the chord selection problem, but not the voice leading issue.
Another heuristic for chord selection is to find a candidate chord that requires the least amount of movement, given the current upper/lower pitches. This would require storing the current pitches of upper/lower, calculating the upper/lower pitches for each chord, and getting the combined step change for each chord. The one with the lowest score wins.
These are two different heuristics for selection, and I can imagine there would be a way to choose one over the other.
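The second heuristic (least movement) could be sketched like this, assuming candidates are already reduced to the upper/lower pitches they would require. This is a minimal illustration, not the real implementation:

```rust
// Sketch of the "least movement" heuristic: given the current upper/lower
// pitches, pick the candidate chord whose upper/lower pitches require the
// smallest combined step change. The pitch pairs are illustrative.

/// Combined steps the upper and lower voices would move to reach `candidate`.
fn movement_cost(current: (i32, i32), candidate: (i32, i32)) -> i32 {
    (candidate.0 - current.0).abs() + (candidate.1 - current.1).abs()
}

/// Index of the candidate with the lowest movement score.
/// Ties go to the earliest candidate.
fn least_movement(current: (i32, i32), candidates: &[(i32, i32)]) -> Option<usize> {
    candidates
        .iter()
        .enumerate()
        .min_by_key(|(_, &c)| movement_cost(current, c))
        .map(|(i, _)| i)
}
```

The "lowest score wins" rule falls out of `min_by_key`; a real version would also need to know each candidate chord's identity, not just its voicing.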
15:27: Implement the "least used chord" heuristic #demo-trio-chords #timelog:00:58:01
16:24: Heuristic implemented in tests, now to try it out #demo-trio-chords #timelog:00:09:35
16:36: Least Movement Heuristic thoughts, test building #demo-trio-chords
Least movement: for each candidate, get the working upper and lower notes of those chords given the current lead pitch. Get the absolute differences between those and the current lower/upper. The one with the lowest score wins.
A good hybrid could be something called "Lazy Least Used". Go for the least used, but only if the movement required is under a certain score. Otherwise, choose the one with least movement.
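The "Lazy Least Used" hybrid might look something like this sketch, where each candidate carries both its use count (from the running heat map) and its precomputed movement cost. The `Candidate` struct and threshold are assumptions of mine:

```rust
// Hedged sketch of the "Lazy Least Used" hybrid heuristic: prefer the
// least-used candidate, but only if reaching it costs no more than
// `max_cost` steps; otherwise fall back to least movement.

struct Candidate {
    use_count: u32, // from the running "heat map"
    movement: i32,  // combined upper/lower step change to reach it
}

fn lazy_least_used(candidates: &[Candidate], max_cost: i32) -> Option<usize> {
    // First choice: the least-used candidate.
    let least_used = candidates
        .iter()
        .enumerate()
        .min_by_key(|(_, c)| c.use_count)
        .map(|(i, _)| i)?;
    if candidates[least_used].movement <= max_cost {
        return Some(least_used);
    }
    // Too much movement required: fall back to the cheapest candidate.
    candidates
        .iter()
        .enumerate()
        .min_by_key(|(_, c)| c.movement)
        .map(|(i, _)| i)
}
```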
16:42: Working out a least movement test. #demo-trio-chords #timelog:00:18:34
Firstly, make sure the low-level chord measurement mechanics are in place and working properly. Then, simulate a chord change.
16:57: Implement measure movement core logic #demo-trio-chords #timelog:00:04:33
17:02: Work out least movement heuristic test #demo-trio-chords #timelog:00:23:06
17:10: Staggered voices complicate things #demo-trio-chords
You need to store what the pitch actually is somewhat reliably. There isn't currently a caching mechanism in place for this.
The entry points for this happen when a pitch is actually scheduled.
17:20: What happens if you schedule a note that is the same? #demo-trio-chords
Minimal movement means lots of situations where a note stays the same.
Shouldn't really matter, for now. It'll cause more complication if a check is introduced (for example, handle what happens if there is no previous note).
17:26: Implement least movement heuristic #demo-trio-chords #timelog:00:23:20
17:50: Hook it up into the system #demo-trio-chords #timelog:00:19:03
18:05: why is it crashing now? #demo-trio-chords
18:21: It works! Next steps #demo-trio-chords
Lazy Least Frequent Heuristic, Y-axis expression.
Nice thing to have: some way to go from staggered to instantaneous behavior.
18:26: LeastMovement crashes due to unwrap still #demo-trio-chords #timelog:00:06:33
Fixed.
18:35: packing up. time for dinner.
19:47: Attempting the Lazy Least Frequent Heuristic #demo-trio-chords #timelog:00:14:43
This one seems difficult to make a test for. I am tempted to just implement it without tests, as the amount of effort it would take to find the perfect test situation wouldn't be worth it.
20:03: It appears to be working. #demo-trio-chords
This is a good stopping point for my chords engine.
20:04: More reverb, predelay too. #demo-trio #timelog:00:14:42
21:02: Some expression control #demo-trio #timelog:00:43:36
2024-07-27 Day 69. Audio Telephone Composing.
13 days left of batch.
08:47: Curation of audio samples for audio telephone
13:09: I have some pads now
14:00: And we're back. Time for drums?
15:27: Programmed a snare pattern. Now for a swarm texture.
16:05: we have swarm, but can it be pitch controlled by gesture?
16:33: Time to drop stuff into audacity
17:13: Uploading to drive
Write permission issues.
17:19: Committing changes
20:18: implementing fallback chords #demo-trio-chords #timelog:00:17:27
It occurred to me yesterday that I am actually close to a proof-of-concept system. I already have one heuristic working. I don't need to build out the others yet.
I do need to build a fallback set of chords in case a chord isn't found for a particular note.
20:44: Figuring out how to actually hook this up #demo-trio-chords #timelog:00:28:53
21:42: Find upper/lower voices of chord... how to? #demo-trio-chords
Instead of a lookup table, dynamically find the closest voices above and below a lead note given a chord.
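A dynamic search like that could be sketched as follows: walk outward from the lead note until a chord tone is found in each direction. The chord is a set of pitch classes (0-11) and the lead is a MIDI-style note number; this assumes the chord is non-empty and the lead sits comfortably mid-range.

```rust
// Sketch of dynamically finding the nearest chord tones above and below
// a lead note, replacing a lookup table. `chord` holds pitch classes
// 0-11; notes are MIDI-style numbers. Illustrative assumptions only.

/// Closest note strictly above `lead` whose pitch class is in `chord`.
/// Terminates within 12 steps for any non-empty chord.
fn nearest_above(lead: i32, chord: &[i32]) -> i32 {
    (lead + 1..)
        .find(|p| chord.contains(&p.rem_euclid(12)))
        .unwrap()
}

/// Closest note strictly below `lead` whose pitch class is in `chord`.
/// Assumes `lead` is at least an octave above 0.
fn nearest_below(lead: i32, chord: &[i32]) -> i32 {
    (0..lead)
        .rev()
        .find(|p| chord.contains(&p.rem_euclid(12)))
        .unwrap()
}
```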
21:52: Note to self: chord select needs to reset on note off somehow #demo-trio-chords
21:57: Initial tests made. That's all I can do. #demo-trio-chords
I am incredibly exhausted today. What a long day.
2024-07-26 Day 68. ChordSelector work.
14 days left of batch.
Prev: Poke facial expressions, ChordSelector Planning, Reading
Next: More ChordSelector work, Balloon port for poke?
09:04: Morning Triage
09:24: Might try to get to this today #voxbox-balloon
I can test it out in Poke
09:26: Publishing
09:36: Responding to Jobs Chat
09:51: Implement query. #demo-trio-chords #timelog:00:59:53
Query: given a current chord, and a new lead note, find all the valid chords to transition to.
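That query could be sketched like this, using diatonic triads in C as a stand-in chord table (the `Chord` struct and the table contents are my illustrative assumptions, not the real data):

```rust
// Hedged sketch of the transition query: given the current chord and a
// new lead pitch class (0-11), return every chord in the table that
// contains the pitch and isn't the chord we're already on.

#[derive(Clone, Copy, PartialEq, Debug)]
struct Chord {
    name: &'static str,
    pcs: [i32; 3], // pitch classes of the triad
}

fn query(table: &[Chord], current: Chord, lead_pc: i32) -> Vec<Chord> {
    table
        .iter()
        .filter(|c| c.pcs.contains(&lead_pc) && **c != current)
        .copied()
        .collect()
}
```

This matches the sanity check described further down: transitioning to D in the key of C from a C chord should yield exactly G and Dm.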
12:57: transition tracker #demo-trio-chords #timelog:02:41:39
13:50: Going to rework query to be immutable
This requires moving the mutable stateful stuff out.
14:06: Something is wrong with the query. #demo-trio-chords
I am trying to transition D4 in the key of C. This should give me two chords (G and Dm). But I'm getting no chords.
Ah, my "RE" constant was miscoded. Phew.
14:13: Need a way to remove candidate chords in place #demo-trio-chords
Chords need to be removed.
14:23: I'm running into overflow issues. Why? #demo-trio-chords
Oh just figured it out. I was using "chord index" instead of "index".
15:07: Initial test created for note transition chord test #demo-trio-chords
This has an interface that applies the heuristic to the candidate, which effectively removes the previous chord in place.
15:12: kind of a clunky interface, but it can be reworked later #demo-trio-chords
I'm querying the chord, removing the previous transition chord, and updating the note pair table in 3 separate steps. It feels maybe too low-level and could be consolidated better?
15:22: remove previous transition not working as expected #demo-trio-chords
It has no effect at all apparently.
15:34: I think I found an off by one error #demo-trio-chords
I'm 99% sure it's the problem, but is it worth the trouble?
There are two kinds of references right now: zero-indexed and one-indexed. The one-indexed ones are used because it means 0 can be reserved for None. was_used_last expects a zero-indexed chord. But really, these indexes are transparent for the most part. add_chord returns a zero-indexed position as well.
I think it makes sense to 1-index everything because that 0 is so critical.
15:52: Fixed indexing #demo-trio-chords
Wow, that was tedious. Glad I had the tests. It almost makes me wish I had used Option
...
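For what it's worth, Rust's standard library offers a middle ground between the 0-as-None sentinel and plain `Option<usize>`: `Option<NonZeroUsize>` is guaranteed to be the same size as `usize` (the None case occupies the zero bit pattern), so the "reserve 0 for None" trick stays free while the type system enforces the check. A small sketch, not tied to the actual chord manager code:

```rust
// Option<NonZeroUsize> gives the 0-as-None representation with type safety:
// the niche optimization stores None as the zero bit pattern.

use std::num::NonZeroUsize;

/// Convert back to the raw sentinel encoding (0 == None).
fn to_sentinel(idx: Option<NonZeroUsize>) -> usize {
    idx.map_or(0, |i| i.get())
}

/// Decode the raw sentinel: 0 becomes None automatically.
fn from_sentinel(raw: usize) -> Option<NonZeroUsize> {
    NonZeroUsize::new(raw)
}
```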
15:56: What happens if the current note doesn't have a valid state transition? #demo-trio-chords
There is no way to account for this yet. It will catch it as having zero candidates though.
15:58: I'm now realizing that my look-up table should just be 12x12, not 12x11 #demo-trio-chords
Because it's harder to calculate what that second index is going to be. There also needs to be a check to make sure prev/next is not equal.
2024-07-25 Day 67. Chord Selector Planning.
15 days left of batch.
Prev: initial dagzet rust port completed.
Next: work out chord engine logic in trio. add more facial expressions in poke, port balloon signal generator to trio?
08:43: Morning triage
08:53: Might be worth trying to port balloon #voxbox-balloon
I have just created the task for it.
08:56: Might be a good day to do some resume follow-up stuff #resume-follow-up
I got some good suggestions.
10:02: finished up responding back to jobs chat.
10:03: publishing.
10:06: laughing facial expressions in poke. #poke-laughter-chitter #timelog:01:05:39
12:54: Trio Chord Selector planning #demo-trio-chords #timelog:01:11:47
14:41: Plan out some initial code #demo-trio-chords #timelog:01:16:26
The initial structures: chords, and valid chord transitions.
15:53: Initial chord structure, now to think about querying #demo-trio
We want to make a query: given some pitch (0-11), and a current chord, provide a range of potential candidates for the next chord.
Since this is going to be used in a realtime situation, my thinking is to populate a pre-allocated fixed-size array that is managed internally.
16:00: Presentations
19:27: Wrap-up chapter 4 (machine language), begin chapter 5 (Computer Architecture) #read-elem-compsys #timelog:01:00:09
2024-07-24 Day 66. Initial Dagzet Rust Port Completed.
16 days left of batch.
Prev: Dagzet initial commandset almost done, but different strange things break when I try to use it. The Trio voice manager mostly works except for that one weird timing thing, which ain't bad.
Next: try to get rust dagzet working with my RC knowledge graph. Investigate timing logic and think about chords in Trio.
09:40: Later morning triage.
09:51: This feels completed. #demo-trio-voice-scheduler #demo-trio-low-timing
There's one timing issue that I'm going to look at, but that feels like another task.
09:52: Description of timing issue #demo-trio-low-timing
Sometimes, when I put the pen down on the tablet, the lower voice will wait an extra beat before turning on. I can see in the window that the time value becomes "1" twice, which I don't want to happen.
I think the issue here may have to do with how I am handling clock resets. It's possible that I missed something when I quickly added it in yesterday.
10:04: Our little fella needs to enjoy themselves #poke-laughter-chitter
The sound could be done to facilitate this better. But I also think the eyes should change from bug-eyed to the kawaii "> <" eyes.
The graphics will be a faster change than the sounds, so I may want to get started on that in the next day or so
10:11: A more sophisticated chord selection algorithm is required #demo-trio-chords #demo-trio
The issue is there is no sense of resolution anywhere. It still kinda meanders.
10:16: Publish
10:27: More dagzet integration work #dagzet-rust #timelog:00:42:14
10:34: I don't think I finished the topsort work #dagzet-rust
11:09: topsort bug fixed. I just didn't finish writing #dagzet-rust
The test I had for it was too small.
11:10: nice! running into invalid command error, which is what we want. #dagzet-rust
11:11: Working on cx command. #dagzet-rust
"cx" is an external connection, and it allows you to make connections to nodes that are external
A partial implementation of cx is all that is needed for now. No aliases, no shorthands. It's all just full paths.
Any nodes used by cx get stored in a list (set?). When unknown nodes are checked, if it can't find the node in the locally created nodes, it'll check the nodes here.
Connections will work the same way since it utilizes a SQL command. The burden is on the dagzetter to ensure that nodes get created before they are connected with cx. This is different from co, which follows a more declarative style.
11:38: set up tests for initial cx behavior #dagzet-rust
11:50: now the implementation #dagzet-rust #timelog:01:40:38
12:09: now to try and hook things up again #dagzet-rust
12:12: In generate edges, I need to ignore external connections #dagzet-rust
At this point, any unknown nodes should have been checked for.
Is there ever a situation where an edge would have an unknown node missed by the unknown node error check?
12:22: I think it works. One quick look around. #dagzet-rust
12:24: Nope. Full page generation is causing a crash in genpage.lua #dagzet-rust
It's a sqlite error. I need to figure out what the error is.
I need to use db:errmsg
I didn't configure the column name correctly.
12:34: More sqlite errors. I think I'm forgetting to name the columns #dagzet-rust
12:36: The rest of the tables need to be implemented #dagzet-rust
12:54: Running into a really odd stack overflow error #dagzet-rust
I'm going to have to call it quits after this.
13:00: We are getting loops in the output. #dagzet-rust
It is random too, probably due to randomness in lua tables.
Something is going wrong in the SQLite table generation. When I place debug prints in the shortname generator, I get a ton of stuff.
16:46: debugging #dagzet-rust #timelog:00:45:48
My goal is to take a look at the generated table outputs of both.
17:00: number of nodes are consistent, name order is not #dagzet-rust
This is probably due to the randomized hashmap structure in Rust.
But, the node IDs are the same, monotonically increasing unique values.
17:02: lines output is identical as well #dagzet-rust
17:14: something is wrong with the connections #dagzet-rust
There seem to be duplicates. I generated and sorted the connections alphabetically and found these.
The rust version has inconsistent connections as well.
17:18: connections: 127 rust dagzet vs 142 lua dagzet #dagzet-rust
It is consistently this. I wonder if "cx" is partially to blame. disabling all the cxs now.
17:21: connections are the same without cx. #dagzet-rust
There are 15 cx connections made, and that accounts for the missing connections.
17:22: I wonder if it's the edge generation somehow #dagzet-rust
17:25: I think I should be doing a name lookup, not using the ID values directly #dagzet-rust
17:30: Okay! that fixed the connections issues. #dagzet-rust
Now, let's see if the error goes away.
17:32: FINALLY. IT WORKS. #dagzet-rust
I think I really mean it this time. It generates the data without crashing and everything. Peeking around, it seems to produce the data just fine.
18:21: I have taken a break to broadcast my success. Heading out now for dinner. #dagzet-rust
20:14: More reading on machine language. #read-elem-compsys #timelog:00:45:17
2024-07-23 Day 65. Voice Scheduler Feature Complete. Rust Dagzet nearly feature complete.
17 days left of batch.
Prev: Work on new voice scheduling system for Trio. File range command in Dagzet. I got to play with this really cool haptic knob controller (NANO-D evaluation kit)?
Next: || dagzet-rust: Finish up file range, and work on the other remaining commands. || demo-trio-voice-scheduler: Continue implementing triggers for specific events with tests.
08:38: Morning triage.
08:39: Clearly not getting around to this #LC75
I am putting it in "main" instead of priority.
Let me get these last few projects in a good place (demo-trio and dagzet-rust), and then I'll try to find time to start this?
I know, I know. I'm putting it off.
08:56: getting ready to fix merge conflicts.
09:07: Publishing
09:20: Actually, before publishing, fix compeng page
09:25: fixed logdb generator bottleneck
Classic mistake: forgot to add BEGIN/COMMIT
09:31: file range table generation #dagzet-rust #timelog:00:07:14
09:39: Off to implement hyperlinks #dagzet-rust #timelog:00:37:18
10:16: TODO task #dagzet-rust #timelog:00:13:18
10:30: Tags (tg) #dagzet-rust #timelog:00:31:30
This one is slightly more interesting because of how the data is represented in SQLite vs how it's entered in dagzet. From a dagzet point of view, tags are a list of entries associated with a node. In SQLite, they boil down into pairs.
Tags should be able to be called more than once.
I think I'm only going to have time for the initial test before I break for lunch. Maybe.
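The list-to-pairs flattening described above could be sketched like this (the function name and the string-based node/tag representation are my assumptions):

```rust
// Sketch of flattening per-node tag lists into the (node, tag) pairs
// that the SQLite representation wants. Names are illustrative.

fn flatten_tags(nodes: &[(&str, Vec<&str>)]) -> Vec<(String, String)> {
    let mut pairs = Vec::new();
    for (node, tags) in nodes {
        for tag in tags {
            // One row per (node, tag) pair.
            pairs.push((node.to_string(), tag.to_string()));
        }
    }
    pairs
}
```

Since tags can be added across multiple tg calls, a real version would also need to decide how to handle duplicate tags on the same node.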
11:01: Test is breaking in a weird way. #dagzet-rust
I'm not sure why it's not getting the correct number of tags? Oh well, will have to look at it later.
12:21: What is going on with this test #dagzet-rust #timelog:00:31:14
I was misinterpreting the boolean result of the hashset insert method. Negating that fixed stuff.
tags finished.
12:53: Attempts to move test to another place #dagzet-rust #timelog:00:08:11
I think for the most part these could be thought of as integration tests.
Nevermind. I just won't fight it right now.
13:03: Select node command is last on my list I think #dagzet-rust #timelog:00:12:18
13:18: I think it's all implemented? Now for an initial replacement test #dagzet-rust #timelog:00:05:23
13:21: Wow, I think it might have worked on the first go. #dagzet-rust
Installed with "cargo install --path .". Replaced my lua program with my rust program. Looked around at the generated output. It seems to be right so far? Publishing to see what will happen.
Yeah, it says it is fine. so that's pretty cool.
13:23: Adding a README #dagzet-rust #timelog:00:19:13
13:43: I forgot about cx. #dagzet-rust
I've done enough for one session. It's almost there.
13:45: I need to see it crash. #dagzet-rust #timelog:00:16:18
It should have crashed. Make it crash, then I'm done for now.
oooh. the "dagzet" tool and "gendb.sh" are different. I gotta replace it there.
13:51: Okay it crashed because I forgot to escape. #dagzet-rust
14:01: Lots more errors. This makes more sense now. Halting for now. #dagzet-rust
I need better error reporting. It seems like this version of dagzet has stricter parsing (you can't have duplicate tags, for example, whereas my old implementation could).
14:13: Trio Triggers #demo-trio-voice-scheduler #timelog:01:24:34
14:44: Already shooting myself in the foot with this code #demo-trio-voice-scheduler
Going to create an interface where hooks are automatically appended and iterated through.
15:22: Working on UpperVoice now #demo-trio-voice-scheduler
Lower voice seems to behave as expected.
15:53: UpperChange works as expected. Triggers are done #demo-trio-voice-scheduler
15:55: Voice scheduler hookup #demo-trio-voice-scheduler #timelog:00:42:19
16:39: Voice manager has been integrated into Trio! #demo-trio-voice-scheduler
Most of the event logic behaves as expected. I'm a little stumped as to why it starts so late. But it feels a lot better now. What's important is that I can reason about it now, and construct simulations in a very precise way.
I think the event handler needed to reset the head/tail indices as well. That fix seemed to clear up all the gesture related issues which I was suspecting.
19:21: Reading: Machine Language #read-elem-compsys #timelog:00:53:45
2024-07-22 Day 64. Haptic Knob Controllers.
18 days left of batch.
I think Haikus have run their course.
Prev: briefly worked on rust Dagzet stuff dagzet-rust, I have an idea of what I need to build out still to get it working on the RC knowledge graph here: check it out.
Next: || dagzet-rust: work on file range command, in addition others. || LC75: study planning ideas || demo-trio: draft out voice state machine
08:50: Morning Triage.
08:57: How can I work on this stuff offline? #LC75
Leetcode provides validation of solutions, which is nice. But having a format that distills the problems into something offline for reference would be nice.
I think I might just manually transcribe the questions to markdown and put them in a repo.
09:03: good time to figure out pikchr? #demo-trio
I'm trying to more formally establish a way to manage how the voice scheduling works for trio.
09:07: Publish
09:16: Some initial inking of voice scheduler #demo-trio-voice-scheduler #timelog:00:49:09
10:36: Attempts at initial code and tests #demo-trio-voice-scheduler
11:23: How to set up event scheduling logic? #demo-trio-voice-scheduler #timelog:00:59:47
I have the basic actions and primitives for the most part. Now, they need to be composed and worked together to trigger events for the other voices.
These events need to be triggered exactly once.
Event triggers at the start of a tick.
In a test, one should be able to tick() then ask the state questions. Is the voice supposed to launch? Is it playing already? Did the voice just turn off? Is it supposed to turn off? Etc, etc.
These events need to happen exactly once. Tick, check and see if things need to be done, tick, those things are done.
prev/curr state system could probably do this? At each tick, update prev to be curr. So, that means changes and probing would have to happen together before the next tick?
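The prev/curr idea amounts to classic edge detection: compare the state at this tick to the state at the last tick, so each transition is observable exactly once. A minimal sketch, with names that are mine rather than the scheduler's:

```rust
// Sketch of a prev/curr edge trigger: each transition fires on exactly
// one tick. Probing must happen after tick() and before the next tick().

#[derive(Default)]
struct GateTrigger {
    prev: bool,
    curr: bool,
}

impl GateTrigger {
    /// Advance one tick with the new gate value.
    fn tick(&mut self, gate: bool) {
        self.prev = self.curr;
        self.curr = gate;
    }

    /// True only on the tick where the gate went from off to on.
    fn just_turned_on(&self) -> bool {
        self.curr && !self.prev
    }

    /// True only on the tick where the gate went from on to off.
    fn just_turned_off(&self) -> bool {
        !self.curr && self.prev
    }
}
```

This also answers the "changes and probing would have to happen together" question: as long as all probes happen between ticks, each event is seen exactly once.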
13:10: Inking out thoughts on triggers #demo-trio-voice-scheduler #timelog:00:20:12
13:35: Attempts to implement triggers #demo-trio-voice-scheduler #timelog:01:57:19
13:53: Core trigger built. Now to try to use them #demo-trio-voice-scheduler
14:36: Inking and brainstorming #demo-trio-voice-scheduler
I was trying to figure out when to trigger certain events like gates and pitch changes. I was having difficulty getting to the bottom of pitch changes for auto voices: when they are initially turned on (happens when the gate is first on), the pitch needs to be instantaneous, otherwise it is scheduled. How does the voice know the difference? What state information is needed?
14:43: Introduce two boolean gates indicating on/off state for auto-voices? #demo-trio-voice-scheduler
This is the best I got so far.
Consider the state needed to determine a pitch change for an autovoice: it happens when the lead note is on, and the timer has reached a particular value. An extra value could be used to determine whether to immediately set or schedule, based on whether the voice is turned on.
An auto voice will always be immediately set to a pitch first, and would therefore set the voice flag to on. Subsequent voice changes will be scheduled until this voice flag is turned off.
The only way the voice flag is turned off is when a gate off event happens. This will turn off all the voices in the state.
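The voice-flag rule above can be sketched as a tiny state machine. The `AutoVoice` and `PitchChange` names are illustrative assumptions, not the real trio types:

```rust
// Sketch of the voice-flag rule: the first pitch after a voice turns on
// is set instantly (and latches the flag); later pitches are scheduled;
// a gate-off clears the flag.

#[derive(Debug, PartialEq)]
enum PitchChange {
    Immediate(u8),
    Scheduled(u8),
}

#[derive(Default)]
struct AutoVoice {
    on: bool, // the "voice flag"
}

impl AutoVoice {
    fn change_pitch(&mut self, pitch: u8) -> PitchChange {
        if self.on {
            PitchChange::Scheduled(pitch)
        } else {
            // First change after being off: set instantly and latch on.
            self.on = true;
            PitchChange::Immediate(pitch)
        }
    }

    fn gate_off(&mut self) {
        self.on = false;
    }
}
```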
14:51: attempting to code this line of thinking up #demo-trio-voice-scheduler
I'm tempted to have this be a set of state separate from the current set of state. I believe you could make the case that the flow of information is one-way: changes from the main state influence the states of the auto voices, and not the other way around.
Separating state like this I think will make it easier to test.
15:14: These states need triggers hooked into them now. #demo-trio-voice-scheduler
15:41: initial firing logic works for a lower voice #demo-trio-voice-scheduler
Right now, it doesn't know the difference between an instantaneous vs scheduled pitch change, but the scaffolding is there.
16:55: dagzet things? #dagzet-rust #timelog:00:05:00
18:55: dagzet: file range work #dagzet-rust #timelog:00:28:08
19:16: initial tests pass, but they are wrong #dagzet-rust
19:38: Attempts to get the file range table working. #dagzet-rust
Okay, nevermind.
20:16: Got distracted.
Haptic rotary knob controllers. What was I supposed to do?
2024-07-21 Day 63. Week 10.
19 days left of batch.
09:15: Morning Triage
Prev: DAG Zettelkasten / generates database code / more will come later
Next: || dagzet-rust: Figure out remaining commands needed to get things working in the RC dagzet, and get them working in the implementation
09:40: Publishing
09:44: Figuring out which commands I need to make #dagzet-rust #timelog:00:14:09
I just need enough to generate my RC dagzet and replace the lua implementation.
The results of my shell-ing:
$ cat *.dz | awk '{print $1}' | grep -v "^$" | sort | uniq -c
127 co
50 cr
15 cx
59 fr
38 gr
165 hl
357 ln
280 nn
40 ns
96 rm
1 sn
15 td
76 tg
4 zz
Done: co, cr, gr, ln, nn, ns.
TODO: cx, fr, hl, rm, sn, td, tg, zz
Wow that's more than I expected. And there's going to be a non-trivial amount of time troubleshooting the SQLite generated code because of course it's not going to work on the first try.
More slow, incremental testing?
No, the errors are going to come from incorrectly generated SQLite code, and SQL code validation is outside the scope of dagzet.
10:00: Let's port some more commands. #dagzet-rust #timelog:01:05:39
10:03: graph remarks table generation #dagzet-rust
10:18: connection remarks: needs edges table #dagzet-rust
I need to generate that once, and cache it in dagzet.
10:23: actually, no not really. SQLite does ID lookup #dagzet-rust
The connections can remain as strings.
10:31: comments now #dagzet-rust
10:37: node remarks #dagzet-rust
10:51: File ranges. (fr) #dagzet-rust
Oh yeah, that's right. This has some shorthand behavior as well. '$' is used to reference the last file. This might take up the rest of my morning before I break.
Some of the slowness comes from building up the test correctly. I'm doing this one right, because I've already done it wrong.
11:09: Some placeholder test code for file range. #dagzet-rust
Comments in place, testing a handful of edge cases. Now I just need to implement incrementally. That's for another time.
2024-07-20 Day 62. My generics start to become unhinged.
20 days left of batch.
Zulip checkins: I don't think this is working for me. I tried out this silly daily haiku thing with an end of the week digest, and it didn't get the traction I wanted. I feel like I'm adding unnecessary noise. So, no more Zulip check-ins for a while. I will continue to log here.
On a more personal note, Zulip is triggering some of the anxiety-induced FOMO feelings similar to the ones that prompted me to delete Facebook a decade ago. Crazy to think that even after all this time and growth, I still find myself reacting to situations in the same way (how disappointing). Going to have to delete Zulip from my phone after the batch is over.
Prev: Synthesized Voices / Three of them in a black box / Controlled by Gesture
Next: || dagzet-rust: Work on cycle checks today. || demo-trio: Begin building a more formal voice handling state machine model.
10:00: Late morning triage.
10:26: Does renoise work on Alpine?
No, it does not. Typical glibc woes.
10:56: Publishing.
12:14: Work on loop checker #dagzet-rust #timelog:00:54:23
13:22: Loop checker passes test. I think it works. #dagzet-rust
13:52: Begin SQLite code generation primitives #dagzet-rust
13:53: Initial thoughts #dagzet-rust
Most of this boils down to string generation. It'd be good to have some intermediate structures before that.
Being able to generate sqlite schemas is a good start. A table as named values, each with a type.
It would be nice to have a consistent ordering of these names, for things like generating insert statements.
A table itself also has a distinct name.
stringify() could be a behavior that types and tables implement.
Ah, stringify is already a behavior the standard library implements.
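The "table as named values, each with a type" idea boils down to string generation, as noted above. A minimal sketch (this is my illustration, not dagzet's actual representation):

```rust
// Sketch of the schema idea: a table is a name plus an ordered list of
// (column, SQLite type) pairs, and "sqlizing" it is string generation.
// Keeping the columns ordered makes insert generation consistent later.

struct Table {
    name: &'static str,
    columns: Vec<(&'static str, &'static str)>, // (column name, SQLite type)
}

impl Table {
    fn create_statement(&self) -> String {
        let cols: Vec<String> = self
            .columns
            .iter()
            .map(|(name, ty)| format!("{} {}", name, ty))
            .collect();
        format!("CREATE TABLE {}({});", self.name, cols.join(", "))
    }
}
```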
14:04: Making a new file. Thinking about SQLite params. #dagzet-rust
14:30: Got some tests going. Now working on a table. #dagzet-rust
14:51: Now, how about some insert statements #dagzet-rust #timelog:02:22:15
The best thing would be to have some struct map to the VALUES part in an INSERT statement. A method on the table, such as sqlize_insert, could then take in a reference to a struct representing that row, one that's able to generate values.
Trying to un-confuse myself: supposing I had table A, I'd want to make it so A.sqlize_insert(row) would generate an insert statement from data in row. Another table B attempting to call B.sqlize_insert(row) would get a compiler error due to a type mismatch.
Generics for the trait? As in, this trait can only work when the type is A, not B.
15:07: Attempting initial insert row logic #dagzet-rust
15:18: My table abstraction could be better #dagzet-rust
The table schema itself needs to be a concrete type, which it is not right now. If that happens, then I can make a row type for that table.
15:21: Make table a concrete type. #dagzet-rust
15:33: Working backwards. #dagzet-rust
Define the interface that I want to see in a test, then work backwards from there.
15:36: Maybe what I want are phantom types? #dagzet-rust
See: <<rust/phantom_types>>.
15:55: The code feels close, but Rust compiler still doesn't like it #dagzet-rust
16:02: Hooked something up using phantom types #dagzet-rust
16:13: Tests passed. I believe this is what I want? #dagzet-rust
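For illustration, a minimal phantom-type sketch of the compile-time pairing described above (all names here are hypothetical, not the actual dagzet code):

```rust
use std::marker::PhantomData;

// A table is parameterized on its row type. PhantomData lets the
// type parameter exist without storing a value of that type.
struct Table<T> {
    name: &'static str,
    _marker: PhantomData<T>,
}

// Anything usable as a row must be able to render its VALUES clause.
trait SqlRow {
    fn values(&self) -> String;
}

// A row belonging to the hypothetical "nodes" table.
struct NodeRow {
    id: u32,
    name: String,
}

impl SqlRow for NodeRow {
    fn values(&self) -> String {
        format!("({}, '{}')", self.id, self.name)
    }
}

impl<T: SqlRow> Table<T> {
    fn new(name: &'static str) -> Self {
        Table { name, _marker: PhantomData }
    }

    // Only rows of type T are accepted; passing a row meant for a
    // different table is a compile-time type mismatch.
    fn sqlize_insert(&self, row: &T) -> String {
        format!("INSERT INTO {} VALUES {};", self.name, row.values())
    }
}

fn main() {
    let nodes: Table<NodeRow> = Table::new("nodes");
    let row = NodeRow { id: 1, name: "top".to_string() };
    println!("{}", nodes.sqlize_insert(&row));
}
```

With this shape, handing a NodeRow to a Table parameterized on some other row type fails at compile time, which is exactly the mismatch behavior wanted.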
16:32: start working on the CLI, generate nodes table #dagzet-rust #timelog:00:28:30
Real quick stuff.
17:01: Some things print. #dagzet-rust
Lots of elaboration needed, but it's a great start. Things work.
18:35: Get more things to print #dagzet-rust #timelog:02:10:58
18:41: IntegerPrimaryKey shouldn't be used in insert. #dagzet-rust
18:56: Oh goodness. These generics are beginning to be chaotic #dagzet-rust
I'm trying to rework the nodes table code into something more re-usable.
It feels like it's just starting to slip into something out of my grasp.
18:59: Ah okay, the rust syntax gets weirder too #dagzet-rust
This makes me feel better. It has an easy code smell.
Make this an interface somehow?
19:03: This isn't working either. #dagzet-rust
You know what, they are just going to be functions. Refactor later.
19:12: rework things to use writer instead of print! #dagzet-rust
I'm hoping to use the trick in C where the file handle can be standard out as well as a file.
19:38: writer works, but I can't get things to abstract well #dagzet-rust
19:49: A useable interface, now to move stuff out of main #dagzet-rust
19:54: Make clippy happy again #dagzet-rust
20:02: Finally, let's get another table in there. #dagzet-rust
Let's try the connections table.
20:11: Add the lines table. #dagzet-rust
20:47: Lines work. This is a great stopping point #dagzet-rust
Most of the hard stuff is accomplished I think.
21:00: Some impromptu showings of trio demo prototype #demo-trio
I turned on gesture again, and I think the wrong notes are caused by the event graph not removing events at the right time. I think. Anyways, this needs to be rebuilt for sure because I don't want to live life doing guess and check.
2024-07-19 Day 61. Gestured Voices Work in Trio.
21 days left of batch.
Prev: Negation in NAND / is a requirement for / Turing completeness
Next: || dagzet-rust: connection remarks, and plan next steps. cycle checking? sqlite code generation? || demo-trio: get the event-driven gesture working || gesture-reset: implement gesture reset that happens on phase reset.
07:48: Morning triage
08:12: Connection remarks next, then what? #dagzet-rust
My goal is to use this rust port as a drop-in replacement for my current dagzet program. So, a good thing to do would be to write a program that scrapes all the current commands used in my RC dagzet instance.
I also have more than enough here to begin working on implementing topological sort and SQLite code generation.
08:19: Clock resets are going to throw off gesture algorithm #demo-trio
One of the things I am doing in Trio is forcing a phase reset on pen down so the voices always stagger the same way. The underlying rephasor in the gesture signal generator does not have a way of handling phase resets, so it'll assume a faster tempo change and cause unwanted jumps in time. The rephasor needs a way to be told that there has been a phase reset, which would cause it to skip calculating the delta time in the next sample tick.
I wonder if we can build a test around this?
08:26: Task created for gesture reset #gesture-reset #demo-trio
I want to not only do the fix, but create a test in the test suite that showcases the problem before the fix is created, just so I know that I properly understand the problem that I am trying to solve.
08:29: Publishing
09:04: Connection remarks in dagzet #dagzet-rust #timelog:01:02:06
09:06: Connection remarks notes: more difficult than expected #dagzet-rust
This is more difficult than I expected. A connection remark needs to reference a connection (the last connection made). So, how does one reasonably accomplish this?
Since the connections are a pair, one approach is to use a 2-dimensional hashmap of values. It feels like a lot more space is used up here. It also duplicates some of the logic in the connections hashmap.
Another thing to do is to somehow reference the connection. An actual Rust memory reference could cause the borrow checker to be grumpy. Using unique ID values to reference each connection, and having that be a key in a hash table could work. This could become a bookkeeping problem if I ever added the ability to remove connections, but there are no plans to add such a feature.
Connection ID can be their position in the "connections" vector.
Since there is no way to select a specific connection, it is reasonable to assume that the last connection will always implicitly be what remarks are written to. So, the ID will always be the length of the vector (minus one for the index).
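A quick sketch of that scheme (field and method names are illustrative, not the actual dagzet code): remarks live in a hashmap keyed by the connection's index in the connections vector, and a remark always attaches to the most recently added connection.

```rust
use std::collections::HashMap;

#[derive(Default)]
struct Graph {
    // Each connection is a (from, to) pair; its ID is its index.
    connections: Vec<(String, String)>,
    // Remarks keyed by connection ID.
    connection_remarks: HashMap<usize, Vec<String>>,
}

impl Graph {
    fn connect(&mut self, from: &str, to: &str) {
        self.connections.push((from.to_string(), to.to_string()));
    }

    // A remark implicitly targets the last connection made: its ID
    // is the vector length minus one.
    fn remark(&mut self, text: &str) -> Result<(), String> {
        if self.connections.is_empty() {
            return Err("no connection to remark on".to_string());
        }
        let id = self.connections.len() - 1;
        self.connection_remarks
            .entry(id)
            .or_default()
            .push(text.to_string());
        Ok(())
    }
}

fn main() {
    let mut g = Graph::default();
    g.connect("a", "b");
    g.remark("a depends on b").unwrap();
    println!("{}", g.connection_remarks[&0][0]);
}
```

Since there is no plan to support removing connections, index-as-ID never goes stale, and the bookkeeping concern mentioned above stays hypothetical.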
09:23: Making tests for connection remarks #dagzet-rust
09:33: Implementing connection remarks #dagzet-rust
09:59: I am weirdly stumped as to why this test is failing #dagzet-rust
It was silently erroring. I needed to add a namespace.
10:16: Time for unknown nodes. #dagzet-rust #timelog:00:21:57
10:20: before topsort, but unknown nodes #dagzet-rust
I initially thought I was ready for topsort and cycle checking. But unknown node resolution should happen first.
An edge list should be created from the connections for the topsort. But those should all be resolved ID values. "Check unknown nodes" should return a set of nodes that do not exist in the node list.
Rust does have a set in their library! Huzzah! Called HashSet. That's what I need.
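A sketch of that unknown-node check using HashSet (the function name and signature are my own illustration, not dagzet's actual API): walk both endpoints of every connection and collect any name that never appears in the node table.

```rust
use std::collections::{HashMap, HashSet};

// Returns the set of connection endpoints that have no entry in
// the node table (name -> ID).
fn check_unknown_nodes(
    nodes: &HashMap<String, usize>,
    connections: &[(String, String)],
) -> HashSet<String> {
    let mut unknown = HashSet::new();
    for (from, to) in connections {
        for name in [from, to] {
            if !nodes.contains_key(name) {
                // HashSet deduplicates repeated offenders for free.
                unknown.insert(name.clone());
            }
        }
    }
    unknown
}

fn main() {
    let mut nodes = HashMap::new();
    nodes.insert("a".to_string(), 0);
    let conns = vec![("a".to_string(), "ghost".to_string())];
    let unknown = check_unknown_nodes(&nodes, &conns);
    println!("{}", unknown.contains("ghost"));
}
```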
10:26: Building test for unknown node checker #dagzet-rust
10:37: Implementing unknown nodes. #dagzet-rust
10:39: Begin cycle checking #dagzet-rust #timelog:00:22:41
10:42: How to make this a testable component? #dagzet-rust
The trick here is that the connections need to be resolved, which is another step to test independently.
A connection list, after it has been verified, should be turned into an edge list. This function is still allowed to error out. This edge list then gets passed in to the cycle checker.
Note: Topsort makes use of sets that will expand and grow. I expect to dynamically generate these inside the function.
Generating an edge list should come first. Panic on missing nodes for now; that can be fixed later.
The top-level cycle checker should be a method called check_for_cycles. The original implementation was able to get some information on these cycles.
Looks like the original topsort populates a "loops found" list. I will do the same, only it'll be a HashSet. On success it will return an Ok, otherwise an error code.
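A sketch of that plan, assuming the edge list is already resolved to ID values. This is a Kahn-style topological sort; the function name matches the one proposed above, but the body is illustrative, not the actual implementation.

```rust
use std::collections::HashSet;

// Returns Ok on success; on failure, returns the set of node IDs
// left over after the sort, which sit on (or behind) a cycle.
fn check_for_cycles(
    num_nodes: usize,
    edges: &[(usize, usize)],
) -> Result<(), HashSet<usize>> {
    let mut indegree = vec![0usize; num_nodes];
    for &(_, to) in edges {
        indegree[to] += 1;
    }
    // Start from every node with no incoming edges.
    let mut queue: Vec<usize> =
        (0..num_nodes).filter(|&n| indegree[n] == 0).collect();
    let mut visited = 0;
    while let Some(n) = queue.pop() {
        visited += 1;
        for &(from, to) in edges {
            if from == n {
                indegree[to] -= 1;
                if indegree[to] == 0 {
                    queue.push(to);
                }
            }
        }
    }
    if visited == num_nodes {
        Ok(())
    } else {
        // Anything with remaining in-degree was never reachable
        // from a zero-in-degree node: a cycle is in the way.
        let loops: HashSet<usize> =
            (0..num_nodes).filter(|&n| indegree[n] > 0).collect();
        Err(loops)
    }
}

fn main() {
    // 0 -> 1 -> 2 is fine; adding 2 -> 0 closes a loop.
    assert!(check_for_cycles(3, &[(0, 1), (1, 2)]).is_ok());
    assert!(check_for_cycles(3, &[(0, 1), (1, 2), (2, 0)]).is_err());
    println!("ok");
}
```

The dynamically grown sets mentioned above (the queue and the loops-found set) are created inside the function, matching the note about generating them on the fly.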
10:51: Implementing initial cycle checker test #dagzet-rust
11:02: Placeholder tests and functions in place. #dagzet-rust
Implementation will come next. Signing off for now.
12:26: Implement Phasor Reset (attempt) #gesture-reset #timelog:01:06:58
12:32: Implementing initial phasor reset test #gesture-reset
Somehow, this needs to simulate a phasor reset, and measure specifically what goes wrong.
13:14: My initial idea of skipping a sample didn't work #gesture-reset
I need to re-remember how this rephasor works. Luckily, I wrote documentation for the C implementation.
The current rust implementation is a little cryptic. Add docstrings now.
13:33: comments added. going to scratch my head how to solve this one #gesture-reset
13:58: More thoughts on rephasor reset #rephasor-reset #timelog:00:32:20
Establishing a line of thought here.
A phasor reset occurs when the ramp suddenly resets back to 0 in the middle of period.
In this scenario, the reset happens as a result of an intentional interrupt. It is known precisely when this interrupt happens. The rephasor is called immediately after this interrupt. For this reason, the idea is that it can be told ahead of time that the phasor reset itself, and to act accordingly.
Other than the interrupt, it is assumed that the clock is moving at a steady rate.
In order for the rephasor to handle a reset phasor properly, it must keep the overall increment amount approximately equal to the increment amount of the phasor. Any jumps indicate a failure to handle this reset properly.
Assuming the rephasor scale amount is 1 (same as input), the increment of the rephasor is determined from the correction coefficient and the delta of the previous sample.
Possible solution: don't update anything with the clock reset artifact in it. Wait for it to pass through the system before updating correction and incrementor.
I think waiting two ticks should do it.
14:12: Trying to implement wait counter method #rephasor-reset
14:21: now test is passing when it shouldn't #rephasor-reset
14:23: oh, perhaps this is a non issue #rephasor-reset
I wrote the test wrong which is why it was failing in the first place. But, now it works.
I think this test might be resilient against this particular kind of reset because the phase goes to exactly 0 and also the overall slope doesn't change.
No changes needed. But the test is in there to prove this.
14:31: Time to actually hook up gesture to trio #demo-trio #timelog:00:36:25
14:48: This code is approaching "putting out fires" mode #demo-trio
I'm feeling a little overwhelmed at the moment.
Erased my changes. Took a breath.
15:04: Things sound better than expected. #demo-trio
It did not crash. Not one crash. Thanks, Rust.
15:07: A quick break. #demo-trio
15:25: More trio tweaks #demo-trio #timelog:00:50:22
15:26: Need a way to instantaneously set value of gesture. #demo-trio
This can be a method or something, like immediate(). Setting the next/prev values to be some scalar value should work for the most part. The event queue should be cleared as well.
15:53: Weird bug: playing the same note twice doesn't trigger events #demo-trio
16:04: A rewrite might be in order for this voice state management #demo-trio
There's too much spaghetti right now. I think I need to actually attempt to model this as a state machine and add some tests. Otherwise, I'm never going to figure out the issue, and this code will become too unwieldy to maintain.
16:13: Gesture is unpredictable. #demo-trio
Not sure if this is my scheduling logic, or some subtle bug with the gesture event queue, or a mix of both.
Testing is definitely in order. But like, not today. I'm pretty drained.
Yeah, considering I want to change the chord selection logic, it's going to be even worse. This needs a rewrite.
I have built what is essentially a concurrent event system. So yeah, this is a tough one to do correctly.
16:20: Working on weekly recap now.
19:17: Reading #read-elem-compsys #timelog:00:43:04
More reading on sequential logic.
2024-07-18 Day 60. Staggered voices work in Trio.
22 days left of batch.
Prev: event queue finished / eventful gesture looming / no more tic80
Next: || demo-trio: start work implementing voice staggering. || dagzet-rust: continue onwards with this || read-elem-compsys: more reading.
08:12: Morning Triage
08:39: complete in theory. time to test in trio demo. #event-driven-gesture #demo-trio
Attempt voice staggering using event-driven gesture algorithm, see what goes wrong.
08:47: dagzet could be a good portfolio project #dagzet-rust
It's got a good scope, it's got data structures, and I'm putting in the time to incrementally test as I go.
This is also a migration project. If I do this right, I should be able to drop in the dagzet program and have it replace my ad hoc Lua code for my knowledge tree generator, which I'm using to power the knowledge graph here.
08:53: This has been completed #rust-neovim-setup
08:54: Not today, vscode. Not today #vscode-rust-setup
vscode does not work on Alpine, and NeoVim works well for my needs.
09:05: publishing
09:08: Oh my god there's a clippy book
See <<rust/clippy_book>>.
09:34: connection shortcuts #dagzet-rust #timelog:01:03:58
09:39: oh no, I'm getting my task tags wrong #dagzet-rust
09:42: back on track. #dagzet-rust
10:00: Reminding myself why string hashmaps are used for nodes #dagzet-rust
Hashmaps are used as the data structure to ensure that a node isn't created twice.
If I wanted to get a node name from an ID, how would I do that? Right now, the answer seems to be to enumerate through all the keys and find a match. I could also just make a separate inverse lookup table out of a vector.
10:20: I made an inverse lookup table #dagzet-rust
It's a memory hit, and there's room for the eventual possibility that the tables will go out of sync. But, it's good enough for now. I imagine I'll be doing this lookup operation quite a few times, and I don't want to do a linear sweep every time.
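A sketch of that pairing (hypothetical field names, not the actual dagzet structs): the hashmap maps name to ID and guarantees uniqueness, while a parallel Vec maps ID back to name, so lookups in either direction are O(1).

```rust
use std::collections::HashMap;

#[derive(Default)]
struct Nodes {
    // name -> ID; also enforces that a node isn't created twice.
    ids: HashMap<String, usize>,
    // ID -> name; the inverse lookup table (index == node ID).
    names: Vec<String>,
}

impl Nodes {
    // Returns the new node's ID, or None if the name already exists.
    fn add(&mut self, name: &str) -> Option<usize> {
        if self.ids.contains_key(name) {
            return None;
        }
        let id = self.names.len();
        self.ids.insert(name.to_string(), id);
        self.names.push(name.to_string());
        Some(id)
    }
}

fn main() {
    let mut nodes = Nodes::default();
    let id = nodes.add("top").unwrap();
    // O(1) in both directions: name -> ID and ID -> name.
    println!("{} {}", id, nodes.names[id]);
}
```

The out-of-sync risk mentioned above exists because two structures must be updated together; keeping all mutation behind one method like add() is what contains it.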
10:50: Up next: connection remarks #dagzet-rust
Variation on a theme, I suspect. Unplugging and getting lunch now.
12:35: Time to revisit the trio. #demo-trio #timelog:03:01:06
Now that I supposedly have an event-driven gesture, I need to begin work incorporating those mechanics into the existing demo.
12:36: refamiliarizing myself with the work. #demo-trio
13:00: light refactoring. main clock added. #demo-trio
Now, for the initial process of introducing these eventful gestures.
Before any events can be added to any gestures, we need to be able to simulate adding events. Events should only be added after the lead has held onto a pitch for one period.
The easiest way to do this is using a global timer that keeps track of the phasor ticks, which occur once a second. If a u32 is used, how many seconds is that?
Okay, so that's about 136 years. I think there's no general risk of things overflowing. I will be long dead and forgotten before that happens to a potential user.
Is this enough granularity though if I want to detect pitch changes? Maybe not actually.
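Double-checking that arithmetic, at one tick per second into a u32:

```rust
fn main() {
    // u32::MAX seconds, converted to years (365.25-day years).
    let seconds = u32::MAX as f64;
    let years = seconds / (60.0 * 60.0 * 24.0 * 365.25);
    // The counter wraps after roughly 136 years.
    println!("{:.0} years", years);
}
```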
13:11: How to tell if pitch has lasted for a period. #demo-trio
Might need to ink this one out.
13:27: Inked out an idea #demo-trio
First, implement things so that it detects changes in pitch. Then, build a sample and hold along with a global monotonically increasing clock that increments at the start of every tick.
Now, any time a pitch change occurs in the lead, it can be compared against the sampled pitch. If the pitch is different, but the timestamps aren't, the whole period is marked as having been changed for the rest of the period.
The "dirty bit" trick is used in situations where the pitch changes quickly inside a period.
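A minimal sketch of the inked-out idea (the struct and method names are mine, not the actual trio code): a sample-and-hold of the lead pitch, a period clock, and a dirty bit for changes that land mid-period.

```rust
// Tracks the lead pitch against a per-period clock.
struct LeadTracker {
    held_pitch: f32, // sample-and-hold of the lead pitch
    held_at: u32,    // clock value when the pitch was sampled
    dirty: bool,     // pitch changed inside the current period
}

impl LeadTracker {
    fn new(pitch: f32) -> Self {
        LeadTracker { held_pitch: pitch, held_at: 0, dirty: false }
    }

    // Called whenever the lead pitch changes. `clock` increments
    // once at the start of every period.
    fn on_pitch_change(&mut self, pitch: f32, clock: u32) {
        if pitch != self.held_pitch && clock == self.held_at {
            // Different pitch, same timestamp: mark the whole
            // period as changed (the "dirty bit" trick).
            self.dirty = true;
        }
        self.held_pitch = pitch;
        self.held_at = clock;
    }

    // At the start of each period, report whether the pitch
    // changed during the previous one, then clear the flag.
    fn new_period(&mut self, clock: u32) -> bool {
        let changed = self.dirty;
        self.dirty = false;
        self.held_at = clock;
        changed
    }
}

fn main() {
    let mut t = LeadTracker::new(60.0);
    t.on_pitch_change(62.0, 0); // change arrives inside period 0
    println!("{}", t.new_period(1)); // period 1 sees the change
}
```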
13:32: Implementing pitch changes, alert on change. #demo-trio
13:37: implementing clock and sample and hold #demo-trio
13:50: shoot. I'm actually unable to articulate when a change *should* happen #demo-trio
I think there may be another state required.
How about this: If there has been a requested pitch change from the current lead pitch, and this request has been held for a period, apply this changed pitch to the current lead.
This should happen at the start of every period.
14:05: well, now my logic isn't working at all. #demo-trio
I was setting things to be negative. Oops.
14:11: this timing logic is trickier than I expected #demo-trio
14:15: Going to try caching the lead value between takes. #demo-trio
Caching the lead value and also storing the last changed value may be what I'm looking for.
Essentially, this lead value needs to be updated regardless, but the other two voices need to be delayed.
14:19: Okay, the complexity comes from the lead value needing to be updated #demo-trio
I can't simply cache the last lead pitch every time because I'm setting it every time.
14:23: The delayed behavior works as expected now. #demo-trio
Now I am looking to delay chord changes on the upper/lower voices.
14:28: I think I might be able to work staggering without gesture #demo-trio
I think I'm going to have to do this anyways because the event scheduler in the eventful gesture is so small, and can really only take one vertex at a time.
14:34: reworking things for variable stagger times #demo-trio
The logic would need to be: lower gets set first, then upper.
Initially I had it so that the chord would be determined, and then you'd send off the events with delays. Rethinking this a bit.
I want it so that the upper pitch only changes if it waits 2 beats, and the lower pitch changes if it waits 1 beat.
14:48: I'm going to try having another state variable #demo-trio
I want to have a state for when the lower has changed but the upper hasn't. When the upper gets changed, that state gets cleared.
15:01: Now I want the voices to be staggered on too #demo-trio
15:08: Weird timing bugs #demo-trio
15:14: Try to reset clock on new voice. #demo-trio
15:34: Clock works, but the voices aren't resetting pitches #demo-trio
This causes some very rude sounding glisses sometimes.
15:45: it works. clocking out. #demo-trio
16:00: Presentations.
I co-presented with Dan and Seamus on "bbl_vizzy".
17:00: Dinner
First time at stickies. Would go again.
19:00: Negation needed for NAND. (approximate time)
DF had a good pair of questions for me. First one was "what is NAND?" "Not AND". Follow up: "Why NAND? Why not AND?". I could not answer this well, but luckily JK was there. I am paraphrasing, but it turns out that the negation operation is important for Turing completeness. You need the negation to do things like turn a 1 to a 0.
Both NAND and NOR are what are known as "universal gates", in that they can be building blocks to implement the core boolean operations like AND, OR, and NOT.
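Those reductions are small enough to spell out. A quick sketch of NAND's universality: NOT is NAND with both inputs tied together, AND is a negated NAND, and OR follows from De Morgan's law.

```rust
fn nand(a: bool, b: bool) -> bool {
    !(a && b)
}

// NOT a == NAND(a, a)
fn not(a: bool) -> bool {
    nand(a, a)
}

// AND == NOT(NAND), i.e. NAND fed back through itself.
fn and(a: bool, b: bool) -> bool {
    nand(nand(a, b), nand(a, b))
}

// De Morgan: a OR b == NAND(NOT a, NOT b)
fn or(a: bool, b: bool) -> bool {
    nand(not(a), not(b))
}

fn main() {
    // Exhaustively check all four input combinations.
    for a in [false, true] {
        for b in [false, true] {
            assert_eq!(and(a, b), a && b);
            assert_eq!(or(a, b), a || b);
        }
        assert_eq!(not(a), !a);
    }
    println!("ok");
}
```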
19:32: Reading. #read-elem-compsys #timelog:01:01:16
2024-07-17 Day 59. Eventful Gestures and Dagzet
23 days left of batch.
Note to self: 3 times 7 is 21. 23 days feels like a small amount of time to me. And it is. But it's also a little over 3 weeks, and when I say it like that it feels like more time.
Prev: Sounds on the gameboy / worked in the emulator / not on the real thing
Next: || tic80-voxbox: I either need to scrap this project, or turn this into a presentable demo || event-driven-gesture keep moving forward with this, try to get a working proof of concept by today || compeng-resources Get resources imported into knowledge graph
08:17: Morning Triage
08:37: follow-up #impossible-day-2
I figured out that some of the example game does have sound working. I don't know what I am missing here. Don't think I'll have much more time for this during my batch though.
08:58: I need to scrap this project, or plan a demo #tic80-voxbox
I don't think I can justify putting more time into this, unless I can make something that is presentable.
09:02: Hoping to get an example by end of day. #event-driven-gesture
09:05: Publishing
09:11: Extracting links and resources from CE's page #timelog:00:36:40
10:04: back to this event queue #event-driven-gesture #timelog:01:26:19
10:05: Did I rework the abstraction already? #event-driven-gesture
10:11: Making clippy happy. #event-driven-gesture
I don't love this. This feels like busywork.
10:16: Monowav errors with cargo clippy #event-driven-gesture
I'm crashing when I run cargo clippy. Let's make it not crash. I can deal with the warnings later.
10:26: Back to event queue, gotta work with other types #event-driven-gesture
Scalars are being handled, now to work with the other types.
10:47: I am annoyed with clippy now #event-driven-gesture
This is why nothing gets done. I've been wasting time trying to make Clippy happy all in the name of supposedly "better" code, and I'm almost 45 minutes in with nothing to show.
It warps the brain, seeing all these squigglies in an editor. Like a teacher marking up an essay in red. I don't think it's actually possible to turn syntax off in nvim. The usual "syntax off" trick doesn't work.
10:49: I think the event queue is done? #event-driven-gesture
I think it's reasonable to think about starting the Gesture.
10:52: Incorporating initial event queue into Gesture. #event-driven-gesture
I'm liking these incremental tests. What's a good way to test that things are working as expected?
For starters, I need to be able to initialize this gesture by pushing an initial set of events.
11:31: got a test to pass for producing first event #event-driven-gesture
12:00: Audio Hang
13:33: Event driven gestures part 2 #event-driven-gesture #timelog:02:03:08
13:41: Need to test that events change when expected #event-driven-gesture
Behavior we want: events should only be processed at the start of a new phase.
Situation: I push an event A and call preinit. I then push event B after the tick is called. What is the expected set of states for each period?
Pre-init sets things up so that our first period for sure starts with A. I believe that first tick sets things up such that the next state it is transitioning to is also A. Either that, or it is uninitialized. Not sure. Checking.
14:13: event queue is not working as expected #event-driven-gesture
My test is currently failing to get the next point as expected.
Oh wait, it's because I'm checking the external clock's phase changes, not the internal change.
14:23: quick check on bbl vizzy
14:26: Trying to see what happens when using =new_period= #event-driven-gesture
14:36: I am using println to set up a timeline of events #event-driven-gesture
The order that things get set is very crucial. I want to see if I am doing the timing correctly here.
14:45: I am trying to understand the timeline #event-driven-gesture
It is eventually correct, but I have to wait about 5 samples after the first period.
Before the first tick, event A is pushed, and then preinit is called. This triggers the next event to happen, which dequeues the event and sets the value.
At the first tick, next vertex is called immediately. There are no values. The cached vertex is left unchanged, so it is still A.
Note to self: I am using new_period wrong. The timeline:
Event B is pushed as an event before the next tick. It will not be read until the next period. When will the next period happen? The current phasor is set to have an incrementor value of 0.1, so 10 samples per period. The rate multiplier has been set to 2/3, which is like slowing things down by a factor of 1.5. The test does seem to pass after about 15 samples, through trial and error.
14:57: I am tracking the change in phase wrong. #event-driven-gesture
I need to keep track of the internal rephasored value and watch for new periods there.
Works!
15:06: Now, onto waits #event-driven-gesture
Waits are events that tell the clock to wait some number of periods before processing the next events in the queue.
A wait time of 1 will wait one whole period before allowing further events in the queue to be processed.
If I push a wait of 1 and then a scalar in the middle of period A, the start of the next period B will dequeue the wait and then suspend reading other events. At the next period C, there will no longer be a wait, and the next value will be set.
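A minimal sketch of those wait semantics (the types and method names here are illustrative, not the actual gesture code): each new period either burns down an active wait or dequeues the next event.

```rust
use std::collections::VecDeque;

// The two event kinds needed for the sketch.
enum Event {
    Scalar(f32),
    Wait(u32),
}

struct Gesture {
    queue: VecDeque<Event>,
    wait: u32,  // remaining periods to wait
    value: f32, // current gesture value
}

impl Gesture {
    fn new() -> Self {
        Gesture { queue: VecDeque::new(), wait: 0, value: 0.0 }
    }

    fn enqueue(&mut self, ev: Event) {
        self.queue.push_back(ev);
    }

    // Called once at the start of each period.
    fn new_period(&mut self) {
        if self.wait > 0 {
            self.wait -= 1;
            if self.wait > 0 {
                return;
            }
            // The wait has just elapsed: fall through and process.
        }
        match self.queue.pop_front() {
            Some(Event::Scalar(v)) => self.value = v,
            // Dequeuing a wait suspends further reads this period.
            Some(Event::Wait(n)) => self.wait = n,
            None => {}
        }
    }
}

fn main() {
    let mut g = Gesture::new();
    // Pushed mid-period A:
    g.enqueue(Event::Wait(1));
    g.enqueue(Event::Scalar(440.0));
    g.new_period(); // start of B: dequeues the wait, suspends
    g.new_period(); // start of C: wait elapsed, scalar applied
    println!("{}", g.value);
}
```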
15:47: Tests have been made. They fail. Now to implement it. #event-driven-gesture
15:53: Wait implemented, tests pass. #event-driven-gesture
16:02: Shelving tic80 voxbox demo for now #tic80-voxbox
This feels like a dead-end at the moment. Maybe if there was already an existing following of tic80 peeps who were adventurous enough to make their own builds. BUT, this is not the case right now.
I'm sticking to shareable presentations that work on the web. Unfortunately, the wasm is pre-built so there is no great way to hack audio into this.
I honestly don't have any meaningful things to do with this new hack I did. It serves no other purpose than "oh, I did it". It's not enough.
16:06: Attempting to implement connect #dagzet-rust #timelog:01:07:59
Connect adds a pair of edges to the graph.
These are set to be local connections. ID values work just fine here.
I'm curious about knowing how I did cx? Was that another structure? Am I checking cycles there?
cx is another structure, and loops/cycles are not being checked for.
16:37: I think connections don't check for existing nodes until the end? #dagzet-rust
My original dagzet implementation represents the dagzet connections as strings that are only checked at the end of parsing when all the nodes have been added. Only then does it attempt to resolve the symbols. This makes for a more permissive parser.
My dagzet parser does the topological sort with the strings. I think it would be better to resolve those chunks into local ID values before the sort.
17:09: Connections work, but unverified #dagzet-rust
This is intentional. Verifications are deferred until everything is parsed. Verification is beyond the responsibility of connection.
There is a linear check for already existing connections, and a test is put into place to make sure this is all properly caught.
17:13: Shortcuts are not yet supported in co #dagzet-rust
Adding TODOs for that.
I've implemented enough today.
19:22: Reading #read-elem-compsys #timelog:01:08:49
Finished up reading last bits in chapter 1. Read chapter 2 (shorter) on Boolean Arithmetic and their ALU.
I have not done any of the projects yet. I am hoping this can be one of those things I can return back to when I have a chunk of time available.
I want to believe that "just reading" without doing projects is "not nothing, not everything". That is to say, I hope I'm getting something worthwhile from this.
2024-07-16 Day 58. Impossible Day. (AGAIN??)
24 days left of batch.
Prev: gesture with events / unit tests for rust dagzet / bytebeat stuff part two
Next: || impossible-day-2: impossible day, take 2: make a sound toy for the gameboy, and do it in assembly.
09:18: Morning Triage.
09:37: Impossible Day Goals #impossible-day-2
This time around, I am hoping to learn how to make some kind of musical instrument sound toy for the gameboy. I want to do it in assembly, and I want it to run on my analogue pocket. If I build a ROM that allows me to push buttons and make it actively control the APU somehow, I'd call that a win.
09:38: Publish
09:50: Get Spirtualized GB/GBC on Pocket #impossible-day-2
10:05: technical difficulties: clicking on roms doesn't load them #impossible-day-2
I click on it, and then they disappear.
10:09: Checking hierarchy
The bios I downloaded from retroarch was called gbbios.bin. Let's see if that works.
It works!
10:18: Curate some links. Scope. #impossible-day-2
10:28: Installing RGBDS #impossible-day-2
Works! -- Also tried some emulators. Does not work, yet.
11:12: mgba is available on alpine, but maybe no sound? #impossible-day-2
12:06: glitchy sound. sigh. #impossible-day-2
13:02: Mostly reading part 1 of the tutorial #impossible-day-2 #timelog:01:01:29
Also skimming a few other documents while doing this.
14:08: Maybe I can fix the audio buffer settings in mgba manually? #timelog:00:13:22
I can compile the source, let me see if the audio code can change the buffer size.
I did find the code. If anything, increasing the buffer size caused more issues. Ugh. Linux, amiright?
14:28: Trying to make the hello world make a sound. (nevermind) #impossible-day-2 #timelog:00:04:13
Actually, I'm going to do the part 2 tutorial now. I feel like I need more context than what the docs are giving me, and it would be nice to have something I know works.
14:33: Doing part of second tutorial #impossible-day-2 #timelog:00:27:15
Got it displaying stuff with a Duck.
Thing is, 16:30 is coming up quickly, and sound is more important than making a game. If I can figure out how to just get a pulse working, that'll be a success.
15:11: Jumping back to sound #impossible-day-2 #timelog:01:14:10
Let's see if I can get an always on tone with unbricked, then I'll feel like I've accomplished something original.
15:33: Sound works in mgba, but not on openFPGA #impossible-day-2
15:48: Looking at GBDK C examples now #impossible-day-2
15:52: Trying the beep example. #impossible-day-2
15:57: Nope wow compiling GBDK is too cursed #impossible-day-2
16:09: I gotta see if this works on the actual cartridge or not #impossible-day-2
It does not.
19:40: More boolean algebra #read-elem-compsys #timelog:01:00:16
The last part of this chapter had the reader implement things like AND, OR, NOT from NAND. I could get NOT without too much trouble. I figured out an implementation of AND, sort of (NOT-NAND did not immediately come to mind. I was too busy trying to find patterns in all the cases). I struggled with OR, even after looking at the answers. It's clear I need to build up more intuition.
2024-07-15 Day 57. Event-driven Gesture, Dagzet, Testing in Rust
25 days left of batch.
Prev: Wanted lines with sound / Built the necessary tools / Abandoned the Thought
Next: || event-driven-gesture: initial work on event-driven gesture interface || dagzet-rust: more dagzet work, cuz I didn't get to it yesterday.
Very tired and groggy today. I've been going to bed too late, and waking up too early.
08:41: Morning Triage
08:42: After all that work yesterday, I didn't use any of it #linear-gesture-interface
I'll briefly try to defend the notion that it was mostly a good effort. For starters, I got a very convenient linear gesture interface, which required some moderate thinking about design in Rust with ownership in mind. Secondly, I got Rust code working in mnolth for the first time, and this is important because it shows that I can leverage what I've done before to boost what I'm working on right now.
08:45: As it turns out, there was no time yesterday. #dagzet-rust
09:01: hoping to start work on event-driven gesture today #event-driven-gesture
Using the trait system in place, the problem is a matter of how to build the "next" gesture. I'm thinking of implementing an event queue, possibly fixed sized.
I think the plan is to have this work in a single-threaded context first, then build abstractions around that inside of an existing event system. (In a webaudio worklet, this would wrap around the "onmessage" passing interface used.)
09:11: Publishing.
Zulip will be a re-cap of last week with a link to the weekly page.
09:50: dagzet in rust today #dagzet-rust #timelog:01:24:57
09:53: Beginning initial top-level struct #dagzet-rust
Eventually this will be populated with data from the commands.
10:02: Now would be a good time to figure out rust docstrings #dagzet-rust
Being able to add in-line descriptions of struct contents that get rendered to rust documentation would be helpful.
10:44: namespace and graph remarks mostly figured out #dagzet-rust
I have an incremental TDD approach to porting this, which feels nice.
10:45: Make use of =Option= to indicate uninitialized values #dagzet-rust
11:14: Better error handling, new node command incomplete #dagzet-rust
I'm making use of Result and putting return codes into a single Enum. This is just mirroring how I'd do it in C. Hopefully it's idiomatic enough in Rust.
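The shape described above, sketched out (the variant and struct names here are invented for illustration, not the actual dagzet code): one enum of return codes, surfaced through Result, with Option marking uninitialized state.

```rust
// All parse errors collected into a single enum, mirroring a
// C-style return-code scheme.
#[derive(Debug, PartialEq)]
enum ParseError {
    NamespaceNotSet,
}

#[derive(Default)]
struct DagZet {
    // Option makes "not yet initialized" explicit instead of
    // relying on a sentinel value.
    namespace: Option<String>,
}

impl DagZet {
    fn set_namespace(&mut self, ns: &str) {
        self.namespace = Some(ns.to_string());
    }

    // Creating a node requires a namespace to have been set first.
    fn new_node(&mut self, _name: &str) -> Result<(), ParseError> {
        if self.namespace.is_none() {
            return Err(ParseError::NamespaceNotSet);
        }
        Ok(())
    }
}

fn main() {
    let mut dz = DagZet::default();
    println!("{:?}", dz.new_node("top")); // errors: no namespace yet
    dz.set_namespace("blog");
    println!("{:?}", dz.new_node("top")); // now fine
}
```

Whether this is idiomatic Rust or just mirrored C is a fair question; the `?` operator at call sites is what makes the Result-based version pay off.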
13:49: Gesture event queue initial scaffolding #event-driven-gesture #timelog:01:46:56
14:27: What are the common verbs used to describe queue operations? #event-driven-gesture
push/pop makes sense to me, but those are stack terms not queue terms.
Wikipedia tells me they use the terms "dequeue" and "enqueue".
Side note: my tired brain is having a great time parsing the "ueue"s of these words. Imagine if you pronounced it "wehweh". Hilarious.
14:59: Unions are unsafe. doh. #event-driven-gesture
I'm kludging it into a struct with optionals for now. This doesn't spark joy. I'm going to test this well, and when/if I refactor, I can rely on these tests.
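A rough sketch of what I mean, with invented event fields (the actual gesture event types are different):

```rust
// Sketch of the "struct with optionals" kludge standing in for a union.
#[derive(Debug)]
struct Event {
    note: Option<u8>,  // set for note events
    rate: Option<f32>, // set for rate-change events
}

struct EventQueue {
    events: Vec<Event>,
}

impl EventQueue {
    fn new() -> Self {
        EventQueue { events: Vec::new() }
    }

    // "enqueue"/"dequeue": the queue verbs, per Wikipedia.
    fn enqueue(&mut self, evt: Event) {
        self.events.push(evt);
    }

    fn dequeue(&mut self) -> Option<Event> {
        if self.events.is_empty() {
            None
        } else {
            Some(self.events.remove(0))
        }
    }
}
```

If/when this gets refactored, an enum with one variant per event type is probably the idiomatic Rust replacement for both the union and the optionals, and std's =VecDeque= would be the usual backing store instead of =Vec::remove(0)=.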
15:11: Get tests working for pushing other event types #event-driven-gesture
15:15: interface is wrong. too much data allocation #event-driven-gesture
For enqueue, I should only be passing specific data types, and then mutating the internal event array. Right now, I'm deep copying an entire event.
For dequeue, I should either return a reference or an id that eventually resolves to a reference.
15:49: More work on dagzet in rust #dagzet-rust #timelog:00:35:58
15:50: Let's run clippy on my dagzet project so far #dagzet-rust
Turns out it doesn't have a lot to say. Voxbox, on the other hand, is a mess.
15:52: cargo clippy is a mess on this repo #voxbox-cargo-clippy
I also think some of the examples are broken?
16:19: finished new node command, now lines. #dagzet-rust
16:32: Looking at bytebeat stuff with dan
17:09: pairing on visuals
19:00: home
20:25: back to dagzet #dagzet-rust #timelog:00:22:25
added a lines command. now I'm very tired.
2024-07-14 Day 56. Week 9. Linear Gesture Builder for mnolth
26 days left of batch.
Prev: A sine tone signal / writes itself to a buffer / after many hours
Next: || add-day-titles: First thing today. || linear-gesture-interface: build some code that makes it easier to build out linear gestures using mnolth || dagzet-rust: if there's time.
09:06: Morning triage
09:23: A couple notes on blip-buffer size #tic80-sine-tone
This is going to bother me if I don't write it down. I made a mention that having a buffer size of (sr/10) would effectively downsample the buffer by 10x, and this is not true. It's not true because that buffer is not filling a second of audio; it's filling 1/60th of a second of audio.
Let it be known that many incorrect assumptions and guesses were made here. I leave them intact for historical reasons.
09:33: Indulging myself a little bit here #linear-gesture-interface
I've got a linear gesture interface that works pretty well in VoxBox, and I'd like to be able to use it with mnolth. I also want to be able to populate gesture from within lua. To do this, I'll need to build out an interface in rust, export a C interface, and write some mnolth glue code (mnodes).
mainly doing this so I can compose some of this audio telephone thing using mnolth. Frankly, the Uxn thing in GestVM is a bit cumbersome.... the setup time takes too long.
09:47: publishing
09:50: Let's add some titles to my logs. #add-day-titles #timelog:00:38:08
10:30: Clearly, I really do not want to do this. #LC75
10:52: Initial linear gesture interface work #linear-gesture-interface
First, I'll build out the initial interface in Rust, which will wrap the existing linear gesture node with a self-contained vector of paths. I'll then write some C-style functions that make it possible to do the following: initialize, append, tick, and destroy. I will make a quick example of this in Rust, which will be a good stopping point.
The next parts involve actually getting this inside of mnolth/mnodes, and creating the glue code there.
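As a sketch of what those C-style functions might look like (the names, signatures, and the trivial "hold the last value" tick below are all invented; the real interface wraps VoxBox's linear gesture, and real exports would also carry =#[no_mangle]=):

```rust
// Hypothetical C-style shim around a Rust-owned object, using the
// Box::into_raw / Box::from_raw pattern for init/append/tick/destroy.
pub struct LinearGestureHandle {
    path: Vec<f32>, // breakpoint values (durations omitted for brevity)
    pos: usize,
}

pub extern "C" fn lg_new() -> *mut LinearGestureHandle {
    Box::into_raw(Box::new(LinearGestureHandle { path: Vec::new(), pos: 0 }))
}

pub extern "C" fn lg_append(lg: *mut LinearGestureHandle, val: f32) {
    let lg = unsafe { &mut *lg };
    lg.path.push(val);
}

// Step through the path, holding the final value once exhausted.
pub extern "C" fn lg_tick(lg: *mut LinearGestureHandle) -> f32 {
    let lg = unsafe { &mut *lg };
    let out = lg.path.get(lg.pos).copied().unwrap_or(0.0);
    if lg.pos + 1 < lg.path.len() {
        lg.pos += 1;
    }
    out
}

// Reclaim the Box so Rust drops it; the C side must not use lg afterwards.
pub extern "C" fn lg_del(lg: *mut LinearGestureHandle) {
    unsafe { drop(Box::from_raw(lg)) };
}
```

The Lua glue in mnodes would then only ever see the opaque pointer.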
10:57: Initial scaffolding #linear-gesture-interface #timelog:00:26:58
11:20: Creating an int to behavior wrapper #linear-gesture-interface #timelog:00:35:38
Using this as an opportunity to learn how to build a function with proper error handling and testing. I was going to do a more elaborate check validating each "okay" one, but it was less messy just to see it catch errors.
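Roughly the shape of it (the behavior names below are placeholders, not VoxBox's actual interpolation modes):

```rust
// Sketch of an integer-to-behavior mapping with error handling.
// On failure, hand back the offending integer so the caller can report it.
#[derive(Debug, PartialEq)]
enum Behavior {
    Linear,
    Step,
    Gliss,
}

fn behavior_from_int(x: u16) -> Result<Behavior, u16> {
    match x {
        0 => Ok(Behavior::Linear),
        1 => Ok(Behavior::Step),
        2 => Ok(Behavior::Gliss),
        other => Err(other),
    }
}
```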
12:01: Build an initial example using interface. #linear-gesture-interface #timelog:01:16:06
12:11: Running into borrow-checker woes #linear-gesture-interface
12:18: More borrow-checker woes. I am thinking of this problem wrong #linear-gesture-interface
Instead of building on the existing LinearGesture Generator, build a new gesture interface that owns a vector that can grow.
12:29: Things work now. That's enough for now. #linear-gesture-interface #timelog:00:27:52
Going to work on the C interface next.
13:59: Attempting the C interface #linear-gesture-interface
14:47: Using C interface with Rust example. #linear-gesture-interface
I think I can do that right?
14:50: We cannot make this both a crate and a library, but can we? #linear-gesture-interface
I did a bit of a hack, which was to duplicate "src/lib.rs", make it "src/clib.rs", and then add an example in the Cargo toml file.
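For reference, a more conventional route might be to let Cargo emit multiple crate types from one =lib.rs= and register the example explicitly; a hypothetical Cargo.toml fragment (names made up):

```toml
# Build both a normal Rust library and a C-linkable static library
# from the same source, plus an example exercising the C-style API.
[lib]
crate-type = ["rlib", "staticlib"]

[[example]]
name = "capi_demo"
path = "examples/capi_demo.rs"
```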
15:14: Time to get this in mnodes. #linear-gesture-interface
16:38: Sidetracked. This will have to wait. #linear-gesture-interface
18:44: Back on this... #linear-gesture-interface #timelog:00:53:38
Most of the day has been wasted away. Going to try to move quickly on this.
19:40: Proof of concept done. Good enough for tonight? #linear-gesture-interface
Gotta compose now.
2024-07-13 Day 55. A sine in tic80. Finally.
27 days left of batch.
Prev: some zettelkastens / directed acyclic graphs, / bytebeats, and trios
Next: || tic80-sine-tone: didn't get to this yesterday, first thing on my TODO today. || add-day-titles: A good day to think about meaningful titles for my previous logs. || dagzet-rust: More dagzet porting.
12:02: Noon Triage
Meant to do a weekly re-cap end of day, but I got caught up in an ad-hoc bytebeat coding session.
12:35: finished up summarizing week 8
I still need to write up yesterday though...
13:51: Sidetracked: Learned about a new dumpling place
Looks good. Gotta go back there and actually get something. Also got caught up in conversation with JM.
13:54: I gotta finish this triage up lol.
I am here to be interrupted.
14:09: Let's try to get back to this tic80 thing #tic80-sine-tone #timelog:01:34:09
14:11: I need to better understand the timing relationships #tic80-sine-tone
I'm not entirely convinced that DSP works at a constant samplerate, because I think the timing is controlled from the drawing function.
My belief right now is that the audio callback works by taking any samples produced since the last time it was set, and then stretching it out to fit the required number of samples needed for the host audio callback.
14:19: "opaqueness" these pointers are very opaque #tic80-sine-tone
I was trying to describe my issue with this codebase. What makes it impenetrable is the heavy use of macros, which makes everything very opaque. I can't simply grep or use ctags to find definitions in the codebase.
Take, for instance, this line (breaks and indentation my own):
14:36: printf-ing some constants #tic80-sine-tone
The output of =blip_read_samples= is a pretty constant 735 samples at a time, which is the number of samples read per frame at 60fps. So it is actually working at the host audio rate. My timing theory is feeling a little shaky.
=TIC80_FRAMERATE= is hard-coded to 60, so it's filling the max amount. Based on the READMEs, I think it doesn't have to be? It seems like the =blip_buf= library is designed to discretize sound chips that don't necessarily have a clock. You just note the times when amplitude changes, and then it turns that into a buffer of audio samples.
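To make that idea concrete, here is a toy model of it, ignoring blip_buf's actual band-limited step synthesis (this is my paraphrase, not the library's algorithm):

```rust
// Toy model of the blip_buf idea: record (time, amplitude-delta) pairs,
// then integrate them into a flat buffer of samples.
// Real blip_buf does band-limited steps; this is naive sample-and-hold.
fn render_deltas(deltas: &[(usize, i32)], nsamples: usize) -> Vec<i32> {
    let mut out = vec![0; nsamples];
    let mut amp = 0;
    let mut d = deltas.iter().peekable();
    for (t, smp) in out.iter_mut().enumerate() {
        // apply every delta whose timestamp has passed
        while let Some(&&(dt, dv)) = d.peek() {
            if dt <= t {
                amp += dv;
                d.next();
            } else {
                break;
            }
        }
        *smp = amp;
    }
    out
}
```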
14:53: =blip_end_frame= is weird. #tic80-sine-tone
For starters: there is no =blip_start_frame=. A call to =blip_end_frame= implicitly begins a new frame.
When it is called, a constant called =END_TIME= is used. =END_TIME= is defined as =CLOCKRATE / TIC80_FRAMERATE=. The framerate, I already know, is hardcoded as 60. What is =CLOCKRATE=, and where is it defined?
=CLOCKRATE= is defined as =(255<<13)= in =core.h=, or 2088960. I do not understand the significance of this value. Going to try to look up that number, as well as the hex version, =0x1fe000=.
I put it into chatGPT, this is what it said:
15:18: =END_TIME=, how is it used in =sound.c=? #tic80-sine-tone
It's used in =runPcm=; there's some math done so that the loop works on a fixed PCM buffer size of =TIC_PCM_SIZE= (which is 128), but the incrementor is in these "clock units", not sample units.
15:27: tic80 audio is weird because =blip-buf= is weird #tic80-sine-tone
15:34: When/where is =update_amp= called? #tic80-sine-tone
Whenever this is called, a new delta is added to the blip buffer (delta-encoding). This seems to basically mean, "add a sample to the resampling buffer".
Called in =runPcm=. It seems to be writing PCM data (128 samples) to the blip buffer.
Also called in =runEnvelope= and =runNoise=. There's a similar pattern to both of these, where a "period" is determined from some frequency value.
16:28: Another attempt. #tic80-sine-tone #timelog:02:07:33
16:29: Let's isolate where those opening blips are coming from #tic80-sine-tone
In the =sfx= callback, which is in =tic_core_sound_tick_start=.
16:38: runPcm isn't doing anything. #tic80-sine-tone
When I comment out the call to =runPcm=, sound still works. This must be some way to get PCM data in, like sample-playback stuff.
16:41: Making test noise now. #tic80-sine-tone
If my intuition is correct, this program takes in a pile of delta time values and turns that into a workable buffer? I have two things to try. One: write new deltas directly to the blip. Another: write to the existing PCM channel.
I got noise working! But it's stereo noise, and I was expecting mono noise. I forgot that stereo synthesize is called twice, once for each channel.
Noise is working! It is being written to the ringbuf PCM data before calls to =stereo_synthesize= are made. I'm noticing gaps in the noise when I go between the editor and the console.
17:03: now, to make it a square. #tic80-sine-tone
I get the sense that I might not be handling the data type correctly. It might be short (16-bit) ints, not 8-bit ones. A pitched square will for sure help me figure it out.
The pitch is wrong. I'm assuming a 44.1kHz samplerate in the square calculation, but the blip-buf might be targeting another samplerate like 8kHz or something.
17:18: trying to understand how blip initialization works #tic80-sine-tone
So the function =blip_set_rates= is definitely being set to the host sample rate of 44.1kHz; however, the blip buffer gets initialized to be only a 10th of that size, making it effectively ~4kHz max. Right? And that's assuming 1 channel. There are 4 channels that need to share that buffer, so that's more like 1kHz?
17:26: Trying the square the blip-buf way. #tic80-sine-tone
Okay. I'm fighting blip-buf itself somehow? I don't want to waste any more time trying to figure that out.
17:44: Writing a square after blip-buf #tic80-sine-tone
17:59: Finally found the tic80 struct? #tic80-sine-tone
It wasn't opaque, I just wasn't looking in the top-level directory. It's in =include/tic80.h=. Wow. That's embarrassing.
18:14: What could be causing the crackling? #tic80-sine-tone
My square shouldn't be crackling. I refuse to believe that it's buffer xrun. Sure does sound like it though.
18:17: zeroing out the buffer should have caused the blip to be off #tic80-sine-tone
This is a clue.
18:25: Good lord we have a square now. #tic80-sine-tone
I just messed up the logic for filling up interleaved audio.
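For posterity, the shape of the interleaving logic I had botched, sketched here assuming 16-bit LRLR frames (a guess at the format; the actual tic80 buffer details are as murky as described above):

```rust
// Duplicate a mono signal into an interleaved LRLR stereo buffer.
// The bug class: writing mono data without the per-channel stride.
fn interleave(mono: &[i16]) -> Vec<i16> {
    let mut out = Vec::with_capacity(mono.len() * 2);
    for &s in mono {
        out.push(s); // left
        out.push(s); // right
    }
    out
}
```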
18:26: Getting ready to make my damn sine. #tic80-sine-tone
18:30: sine tone created. it is done now. #tic80-sine-tone
2024-07-12 Day 54. Trios, triads, and bytebeats.
28 days left of batch.
Prev: More sine tone attempts / More singing drawing tablets / All you need is NAND
Next: || compeng-resources import CE links into knowledge graph. || dagzet-rust Try to get to this today? || tic80-sine-tone: timebox this today, see if I can make more progress. || demo-trio: initial code, get chords working.
08:13: Morning triage.
08:14: Did not get to this yesterday. Today maybe? #dagzet-rust
08:44: I think I nerfed the experience with quantization #drawing-tablet-demo
I think I might have nerfed the experience with the pitch quantization, to be honest. It seems like the people I showed the quantized version to used it for less time compared to when it was "fretless". People really enjoyed the fluid expressiveness of the pitch control and getting it to emote using just inflection. Food for thought for another demo?
08:55: This would be a good weekend project #add-day-titles
08:56: I have still not started this. Sigh. #LC75
08:57: Don't know if this is worth my time to actually read right now #react-escape-hatches
08:58: This concept feels pretty fleshed out #concept-concerto #demo-trio
I think I might call it "trio", and the elevator pitch is that you control one voice in a trio, and there is an algorithm that controls the other two pitches to form 3 part harmony.
Follow-up task created: demo-trio.
08:59: First step in trio: get instantaneous chords #demo-trio
09:02: Publishing
09:16: initial thoughts #dagzet-rust #timelog:00:18:23
My hope is that the Lua implementation is trivial enough that I can bring it over to Rust without too many complications. I will outline some of the broad-strokes steps required to get this program up and running.
While writing these thoughts:
The first thing I'll need to be able to do is read a file from disk, possibly entirely into memory. The dagzet parser works line by line, so if there's some iterator abstraction that allows me to do this, great.
For parsing commands, I need to be able to read the first three characters of each line to determine the command code. Commands are two characters followed by a space, with the rest of the line being arguments.
Commands need to map to functions which can parse the argument data of that line, and potentially append to or modify a Rust data struct representing the graph being built up. Just setting up a convenient way to map commands to functions would be great. There will be quite a few, and I often find myself wanting to add more commands to meet my needs. In Lua, I made use of tables to create a look-up table of callbacks. Hopefully I can do a similar thing in Rust without too much fuss?
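One way the Lua-style lookup table of callbacks could translate (the command code and the =Graph= struct here are invented for illustration):

```rust
use std::collections::HashMap;

// Stand-in for the real graph state being built up.
struct Graph {
    nodes: Vec<String>,
}

// Each command is a function taking the graph and the argument string.
type Command = fn(&mut Graph, &str);

fn new_node(gr: &mut Graph, args: &str) {
    gr.nodes.push(args.to_string());
}

// A HashMap from two-character command codes to callbacks,
// mirroring the Lua table-of-functions approach.
fn make_table() -> HashMap<&'static str, Command> {
    let mut table: HashMap<&'static str, Command> = HashMap::new();
    table.insert("nn", new_node);
    table
}
```

That said, a plain =match= on the command code is often the more idiomatic Rust move when the command set is known at compile time.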
I need to be able to split data up by spaces. Lua does not have a built-in =split()= like you'd see in other languages. I'm hoping the Rust standard library has one somewhere (this functionality should be standard, not outsourced to a crate, right? right?!)
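It does: =str::split_whitespace= (or =split(' ')= for strict single-space splitting) ships with the standard library, no crate required. For example:

```rust
// Split a command's argument string on whitespace using std only.
fn split_args(line: &str) -> Vec<&str> {
    line.split_whitespace().collect()
}
```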
I need a top-level data struct that can be populated with information that is parsed. In the Lua implementation, I used tables: usually as a hashmap or array structure, sometimes an array of arrays. IIRC, Rust's std has hashmaps, and vectors should be enough. I can't foresee too many ownership issues due to how imperative this is, but who knows.
I need to implement topological sort (Kahn's algorithm). My lua approach used node IDs instead of references, so I think this is going to be mostly Rust-friendly to port. Still, I get the feeling that I may be forgetting something that will be a pain point with Rust.
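A sketch of Kahn's algorithm over plain node IDs, which should indeed keep the borrow checker quiet since no references are stored (edge representation is illustrative, not dagzet's actual one):

```rust
// Kahn's topological sort over node IDs (usize indices).
// Returns None if the graph contains a cycle.
fn kahn(nnodes: usize, edges: &[(usize, usize)]) -> Option<Vec<usize>> {
    let mut indeg = vec![0usize; nnodes];
    for &(_, to) in edges {
        indeg[to] += 1;
    }
    // start with every node that has no incoming edges
    let mut queue: Vec<usize> =
        (0..nnodes).filter(|&n| indeg[n] == 0).collect();
    let mut sorted = Vec::new();
    while let Some(n) = queue.pop() {
        sorted.push(n);
        for &(from, to) in edges {
            if from == n {
                indeg[to] -= 1;
                if indeg[to] == 0 {
                    queue.push(to);
                }
            }
        }
    }
    // any node never reaching in-degree zero means a cycle
    if sorted.len() == nnodes { Some(sorted) } else { None }
}
```

(The linear edge scan per node is O(V*E); an adjacency list would be the obvious upgrade, but this is the whole algorithm.)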
When it has been determined that the graph contains no cycles, generate the SQLite code to standard output. This seems pretty straightforward to me: more or less going through the generated struct (read-only) and printing equivalent SQLite code.
09:45: Starting initial boilerplate code. #dagzet-rust #timelog:01:07:29
09:46: get it to read lines of a file #dagzet-rust
Going to try to use neovim for this now.
tangent: trying to get auto-import working. Found: <<neovim/nvim_cmp>>.
how to get this working with lazy, README only has vim-plug?
Wait, it's already installed, according to the Lazy control panel. What does InsertEnter mean?
Okay, I get the recommendations, but I don't know how to insert it.
Got it! Typing "File" then hitting ctrl-y does it.
10:19: haha. flopping around with rust compiler on trivial things. nailing it. #dagzet-rust
10:24: give it a file on the command line #dagzet-rust
10:39: Parse lines, find their command code. #dagzet-rust
10:51: That's some good enough boilerplate. #dagzet-rust
I have it parsing command codes in the test file, and there is some placeholder stuff where I can eventually do stuff with those commands.
12:15: Back at hub. Setting myself up.
I got a nice big screen here. Looks nice.
12:21: Initial boilerplate code, maybe some chords #demo-trio #timelog:01:33:13
12:22: make copy of the singer test. #demo-trio
12:36: getting the pitch control happening in rust instead of C #demo-trio
12:51: time for initial chord logic? #demo-trio
For now: given a pitch (octave-wrapped, 0-11), find the corresponding triad. Use a lookup table (vector, probably).
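A guess at what that lookup could look like, using C-major diatonic triads as a stand-in (the actual chord logic for trio is still TBD):

```rust
// Map an octave-wrapped pitch class (0-11) to a diatonic triad in C major.
// Out-of-scale pitches snap down to the nearest scale degree.
fn triad(pitch: u8) -> [u8; 3] {
    const SCALE: [u8; 7] = [0, 2, 4, 5, 7, 9, 11]; // C major pitch classes
    let pc = pitch % 12;
    // nearest scale degree at or below the pitch class
    let degree = SCALE.iter().rposition(|&p| p <= pc).unwrap_or(0);
    // stack thirds: degree, degree+2, degree+4 (wrapping the scale)
    [
        SCALE[degree],
        SCALE[(degree + 2) % 7],
        SCALE[(degree + 4) % 7],
    ]
}
```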
12:54: consolidate voice into struct #demo-trio
13:05: two initial instances #demo-trio
13:16: initial chord look-up table #demo-trio
13:51: initial chord logic works! #demo-trio
I'm going to need to rework this to make it work with gesture and chords, but this is a good start.
13:56: changing the smooth times makes a HUGE difference wow #demo-trio
16:43: Fell into pairing all day.
Side note: the LED sign is 96x38
18:21: Wrapping up
2024-07-11 Day 53. All you need is NAND.
29 days left of batch.
Prev: I truly do not / understand how the sound works / inside TIC-80
Next: || poke-sound-warning: shoot I forgot to do this one yesterday. || tic80-sine-tone: Not completely out of ideas yet || drawing-tablet-demo: time to transition from experiment to demo || dagzet-rust: peek at this, if there is time?
08:28: This is not worth any more of my time.
Considering how few of these I'm realistically going to get to, it's going to be easier to just manually print them to PDF to get them on my RM. Solid attempt though, had a good adventure.
08:32: Ooops forgot. First thing today #poke-sound-warning
08:33: I have a few more ideas on how to do this #tic80-sine-tone
Look at =tic_core_sound_tick_start= and =tic_core_sound_tick_end=.
08:35: Officially concluded. #drawing-tablet-experiments
Now onto demo
08:37: Drawing tablet demo: mono singing voice #drawing-tablet-demo
Going to try and timebox this to just today. What I want is to get pitch snapping on x-axis, reverb and delay effects, as well as y-axis intensity control.
08:39: Some initial boilerplate rust code would be nice to set up today if there's time #dagzet-rust
08:40: Publishing
08:51: Gotta add a sound warning to poke. #poke-sound-warning #timelog:00:04:57
09:01: Another stab at this tic80 sound stuff #tic80-sine-tone
09:06: looking at ringbuf in tick end function #tic80-sine-tone
It seems like there's a ring buffer of registers, and a register holds the data for one frame of audio. A frame consists of 4 channels of sound. Each channel is represented as a 32-bit signed integer.
When the end tick function is called, it gets the current register in the ring buffer, copies the register information in memory (RAM) to that ring buffer, then copies the "pcm" and "stereo" data bits as well. The ring buffer head is then incremented, but it's inside some conditional. I can't fully grok the conditional, but it seems to be some kind of bounds checking? I'm going to ignore it mostly for now.
09:18: Flipping back to =studio_sound= again. Does it get called? #tic80-sine-tone
Yup, it's getting called.
09:22: Back to sokol. Are we in sokol? #tic80-sine-tone
No. No we are not in Sokol.
09:26: We are in SDL! #tic80-sine-tone
I've zoomed up to the top-most callback level, which looks familiar enough to me. Still, it's doing stuff in a less straightforward way than usual.
This is the code worth digging into a bit, which is responsible for copying TIC-80 PCM data into SDL's audio buffer. I have reformatted it a bit to make it more readable. Initially, it was all on one line.
*stream++ =
    ((u8*)tic->product.samples.buffer)[
        tic->product.samples.count *
        TIC80_SAMPLESIZE -
        platform.audio.bufferRemaining--
    ];
The =stream= variable here is a chunk of u8 vars.
=TIC80_SAMPLESIZE= ends up being 2. This could either mean it's a stereo frame of 8-bit values, or the sample type is 16-bit and the samples are interleaved if it's stereo. Unclear.
The =bufferRemaining= variable is decreasing by 1 every sample and being subtracted from the total number of bytes in the buffer. This feels like a bit of a roundabout way to avoid a for loop.
When the audio buffer counter goes to zero, it fills up the tic80 buffer again using =studio_sound=. It continues doing this until the SDL buffer is filled.
09:50: =tic_core_synth_sound= updates tail #tic80-sine-tone
Meanwhile, =tic_core_synth_head= updates the head. The relevant bits of code look almost identical.
09:53: Just printed tick start/ends and sound synth #tic80-sine-tone
There's some kind of async stuff happening I think. A tick start always follows up with a tick end, so that's being called on one thread. Meanwhile, the synth sound function can be called a few times in between these ticks. Sometimes once or twice, or not at all.
09:56: Where are the tick start/end functions being called? #tic80-sine-tone
There's mention of a =tic80_tick= function, but when I try to printf from it, nothing shows up.
Okay, found it! Inside =gpuTick()= from SDL, the call chain is =studio_tick=, then =renderStudio=, then calls to =tic_core_tick_start= and =tic_core_tick_end=, which make calls to the sound tick starts and ends.
10:10: That's enough for now #tic80-sine-tone #timelog:01:13:35
It's looking to be an interesting sound problem, because the timing of sound rendering is related to the drawing callback. There are things like jitter and frame-drops. How does it handle that sort of thing? Does the so-called "blip buffer" smooth things out somehow?
11:00: Chat with Sonali
12:47: Time to start drawing tablet demo #drawing-tablet-demo #timelog:01:28:59
13:53: Some light refactoring #drawing-tablet-demo
14:20: Lots of very welcome distractions #drawing-tablet-demo
But we are back. Attempting to clean up main a little bit. Starting with the arg parsing.
14:42: Add an interrupt signal. #drawing-tablet-demo
It keeps screwing up my terminal window when I ctrl-C.
14:46: Move out tablet control to separate file. #drawing-tablet-demo
15:10: Make the pitch snap #drawing-tablet-demo
16:00: Presentations / Mini Dinner
17:41: Reading: boolean algebra #read-elem-compsys #timelog:01:30:42
2024-07-10 Day 52. I do not understand how tic80 sound works.
30 days left of batch.
Prev: Drawing tablets. lots / of singing drawing tablets. / so many tablets
Next: || tic80-sine-tone: begin initial work on this || drawing-tablet-experiments: get new tablet working || drawing-tablet-demo: start building out the singing demo.
09:38: Sending this off. #resume-setup
10:47: Late morning triage
11:11: This feels complete #demo-react-UI #voxsculpt-preset-manager
I don't have preset import/export, but that can be another task.
11:14: This would be a good day to try to work on this #tic80-sine-tone
11:16: Singing tablet demo features to add #drawing-tablet-demo #timelog:00:56:06
I want pitch quantization, and a y-axis controlling intensity
11:36: Publishing.
12:02: Trying out my new x-pen #drawing-tablet-experiments
12:06: It works #drawing-tablet-experiments
32767 in both directions, which potentially poses a problem if you want to preserve the aspect ratio.
Just measured: 4x3 inches, so a 4:3 ratio.
In theory, it's 15-bit resolution, which is great considering the size. But is it actually 15-bit, or just upsampled?
12:11: udev rules to use without sudo #drawing-tablet-experiments
Also wondering: how to get max resolution automatically of touchpad?
Ah, yes it does <<misc/libevdev_get_abs_maximum>>
12:20: Disabling the tablet from sway. #drawing-tablet-experiments
Note to self: I gotta organize this script better.
12:24: Hook-up to singer #drawing-tablet-experiments
12:38: Tongue shape control. #drawing-tablet-experiments
14:00: Lunch.
14:45: Creative coding show and tell
15:20: Attempts to get a sine tone playing in tic80 #tic80-sine-tone #timelog:01:19:56
15:24: =update_amp()=, what does that do? #tic80-sine-tone
Ah, I don't think I have to care about that, I can process data directly.
16:22: runPcm not working. going a level down. #tic80-sine-tone
16:26: How does =tic_tool_noise= generate sound?
Wait, runNoise
is what I want.
16:55: Wow, I am having a hard time grokking this code #tic80-sine-tone
I don't understand what update_amp
is doing. It seems to be some kind of delta encoding.
I'm just putting printf statements in functions now to see what runs and happens.
17:39: the =tic_sound_register_data= is key #tic80-sine-tone
It can be found in =core.h=. The value that I think is relevant for an arbitrary sine tone is "amp", which is the "current amp in the delta buffer". Does that mean the amp is delta-encoded or PCM-encoded?
There are 4 channels of sound.
17:45: Follow-ups #tic80-sine-tone
How can I print any non-zero sound PCM values to terminal?
What's the deal with registers? How do they end up in a PCM buffer?
18:05: brainstorming #concept-concerto #timelog:00:05:53
18:15: brainstorming #concept-concerto #timelog:00:14:53
2024-07-09 Day 51. Singing Drawing Tablets
31 days left of batch.
prev: tic80: buffers? / staggered voice algorithm, / wacom tablet fun
Next: || drawing-tablet-experiments: Build a proof of concept voice-theremin, precursor to interface work for drawing-tablet-demo mechanics. || demo-react-UI WebAudio work. || LC75: make PDF versions so I can read on RM tablet. || consider-poke-follow-up: please write some tasks. I think I want to push myself to iterate.
07:51: Morning Triage.
08:02: Escape hatches next on the React reading list #react-escape-hatches
08:16: have this work for wacom and xp-pen #drawing-tablet-demo
I have access to a Wacom tablet now, and I kind of love the size of it.
08:19: Didn't get to this yesterday, today hopefully #demo-react-UI
08:19: Initial investigation completed, now onto initial challenge: sine tone test. #investigate-tic80-audio #tic80-sine-tone
I think I know about enough to get started on hacking in a sine tone into the TIC-80 sound codebase.
A sine tone is ideal because it's a pitched pure tone, and it can be easy to tell if I'm getting the buffering and sample rate correct. If I can get a sine.
08:22: Drawing tablet experiment vs demo #drawing-tablet-demo #drawing-tablet-experiments
There is a bit of overlap here. I want the end of the "drawing tablet experiments" to have sound, and the "drawing tablet demo" to build up from that initial demo, tuning the sound better, and adding an abstraction that will make it easy to go between tablets.
08:24: Did not get to this yesterday. #LC75
In the true yak-shaving tradition, I want to build a quick webscraper that takes all the links in the LC75 page and then prints them to PDF. From there, I can view them on my remarkable, so I can be away from my computer and study them with pen and paper.
08:27: Write some tasks today please. #consider-poke-follow-up
I think it would be good for me to iterate on this. I'm getting great feedback.
08:28: Pokey fella needs to laugh #consider-poke-follow-up
08:30: There needs to be a sound warning #consider-poke-follow-up #poke-sound-warning
Someone yesterday got very surprised by the sound and immediately closed the browser. Oops.
08:39: Publish
09:00: Some experiments with the wacom. #drawing-tablet-experiments #timelog:00:36:51
09:01: What model is this Wacom Tablet? #drawing-tablet-experiments
Wacom Intuos PT M 2, according to dmesg. Researching.
More specifically: CTH-690 [Intuos Art (M)] (using product/vendor ID lookup).
Super cheap to buy on ebay.
09:25: I have to learn about the finger tracking information here #drawing-tablet-experiments
Is it multi-touch?
09:39: ABS_MT_SLOT keeps track of multi-touch points I think. #drawing-tablet-experiments
I'm counting 10 points multitouch.
X axis resolution: ~2143 (2150?), Y axis resolution: ~1333
09:49: Time to make some noise. #drawing-tablet-experiments #timelog:01:07:40
09:54: gotta build libsoundio on this device to get jack backend #drawing-tablet-experiments
09:57: voxboxOSC builds and runs. now to extract the relevant bits. #drawing-tablet-experiments
No need for the OSC bits, but the sound and DSP are a good start.
10:13: time to get the code tablet hooked up #drawing-tablet-experiments
udev permissions are in order.
10:33: Only set pitch when pen is down. #drawing-tablet-experiments
10:47: It works. Gotta play with this thing now. #drawing-tablet-experiments
10:54: Adding files and committing. #drawing-tablet-experiments
11:40: Experimenting with tablets again
Tried to get an xp-pen tablet working that Seamus had. Could not use evdev to parse the tablet events for whatever reason, but I could read the raw bytes using cat and xxd.
12:00: Audio Hang
13:00: Lunch
13:42: Now what?
13:44: Initial work on web scraper for LC75 #LC75-scraper #timelog:01:11:40
14:00: wget doesn't work, curl doesn't work, selenium hangs #LC75-scraper
14:05: trying wget again #LC75-scraper
14:13: I just copy-pasted from the view source in browser #LC75-scraper
It had the data embedded in there from a JSON blob. I was able to extract it in Vim. Currently jq-ing it.
Here is my big query:
jq .props.pageProps.dehydratedState.queries[0].state.data.studyPlanV2Detail.planSubGroups data_pretty.json
14:34: Picking apart LC75.json data #LC75-scraper
It's an array of items, split up by category (of which there are 22). Each entry is an object with a =questionNum= field. When you add all the =questionNum=s together, you get 75.
The key I want for the URL is =titleSlug=, and the id is =questionFrontendId= for the LC (I can use this to reference the problem tersely online). These are objects found in the "questions" objects, aka =.[0].questions[0]=.
14:44: Headless print on linux? #LC75-scraper
It's not going to matter.
14:49: I have hit a security wall #LC75-scraper
14:57: Can use wayback machine with limited success #LC75-scraper
Given a leetcode article, I can find a cached version on the Wayback Machine, which I can then curl to an HTML file. Most of the HTML can be stripped using =w3m -dump=.
15:27: Does wayback machine have an API for getting cached pages? #LC75-scraper #timelog:00:15:48
It does! There's a URL you can query and get a JSON response from. It does produce HTML, but, even with w3m it does seem to be kinda messy. Damn.
16:00: making some follow-up tasks for poke. #consider-poke-follow-up #timelog:00:07:15
16:01: Sounds events should happen when there's a collision #consider-poke-follow-up #poke-collision-sounds
16:03: successive pokes should build up energy #poke-evolving-sounds
For example, my "balloon" control signal generator could probably work here.
16:07: poke pellets: drop things to have the fella eat #consider-poke-follow-up #poke-pellets
16:08: Implement laughter and chitter sounds #consider-poke-follow-up #poke-laughter-chitter
16:18: More webaudio code porting #demo-react-UI #timelog:00:35:00
Note to self: we're calling this voxsculpt.
16:52: Mostly works. Now the rest. #demo-react-UI
16:53: "User aborted request" #demo-react-UI
Happens when I attempt "await startAudio" in onclick callback.
16:58: Sound is made, but sliders do not control anything. #demo-react-UI #timelog:01:10:44
Also paired with Dan to help out with things.
19:04: Troubleshooting xp-pen tablet. #drawing-tablet-experiments #timelog:00:23:34
2024-07-08 Day 50. Week 8. More TIC80 code exploration, unexpected Wacom Tablet, staggered voice algorithm
32 days left of batch.
prev: TIC-80 Sound Code / Drawing tablets for music? / Planning next demo.
next: || LC75: Begin leetcode today I guess? || investigate-tic80-audio: Get a better sense how the audio callback works, then create follow-ups. || demo-react-UI: continue getting WebAudio code ported to typescript. || resume-setup: Finishing touches I swear. || consider-poke-follow-up: Create some feasible tasks today.
Ears have been quite blocked this weekend. Still blocked today.
08:03: Morning triage
08:15: Leetcode task created. #LC75
08:17: Try to get project links working today #resume-setup
Timebox it to about 30 minutes. Give it one last look, then send it to Jobs Chat.
08:18: Downgrading shape morphing #voxbox-shape-morphing
The approach here isn't interactive enough, and I think I need to keep focusing on interaction. I'd rather have one mediocre vowel shape and interaction than several excellent vowel shapes and no interaction.
08:27: Going to try to make some follow-up poke tasks today #consider-poke-follow-up
08:29: Been thinking a lot about the mechanics of this over the last few days #concept-concerto
Things involving Gesture mechanics. Specifically, the concept of getting some kind of "auto-accompaniment" to work with Gesture signal generator systems. Something something event scheduler. Might be good to write some things down.
08:32: Writing better error handling for gendb
09:01: Phew. Okay. Now, we can publish.
09:10: TIC-80 investigations part 2. #investigate-tic80-audio #timelog:01:017:34
09:19: 8-bit audio. #investigate-tic80-audio
I shouldn't be terribly surprised by this, but the output is 8-bit.
09:24: 8khz 8-bit singing vocals don't sound too terrible
Tested on my linear gesture demo using SoX. The darker sound actually works favorably for this synthesizer.
Converting to LPC10 (via sox), however, sucks. The pitch is wrong.
09:48: second-guessing myself on the 8-bit depth #investigate-tic80-audio
It could be that that's just a raw byte format. The sound code here is fairly involved. I'm going to need to look at this more.
09:50: Unclear what is a buffer and what is a tick #investigate-tic80-audio
It is unclear to me when things are computing a single sample of audio (tick), and when it's writing to a buffer.
For example, SOKOL seems to be writing shorts (16-bit).
What I do know is that samples eventually show up in something called "product->samples.buffer", but I cannot find that definition yet (there's a lot of macro magic in this repo). There is also a buffer count too. It appears that these could be interleaved samples.
10:20: So much is implicitly defined #investigate-tic80-audio
As far as I can tell, product.samples is dynamically generated. It's not actually anywhere in a struct. The tic80 struct I have yet to actually find. Same goes for product.samples.buffer and product.samples.count.
There are two top audio callbacks that tell tic80 to generate sound (how much sound, I'm not sure, but it's definitely multi-channel). I'm looking at the one sokol uses, called studio_sound. This calls tic_core_synth_sound(), which I assume writes to product.samples.buffer somehow. If I can inject an effects processor right after the buffer is fully written, I think that can be enough for me.
10:30: runPcm() and stereo_synthesize() are things to look at #investigate-tic80-audio
These calls seem to happen right before they are resampled.
11:00: Lunch
12:00: Talk with Chirag
13:00: C Creatures
14:38: Play with Carsten's drawing tablet #drawing-tablet-experiments
Can it work on linux okay? Can I use libevdev with it?
14:40: It works out of the box #drawing-tablet-experiments
14:45: Oh wow, you can use this one as a touch pad too!
It even does touch-to-click. This is incredible.
14:51: Working through some evdev sample code I found online
15:13: Initial polling program working! #drawing-tablet-experiments
X max: 21600, Y max: 13500, Physical units: 8.5 x 6 inches.
Along the Y-axis, this evenly comes to 2250 pixels/inch. The X axis is about 2541.1764. Huh.
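A throwaway check of that arithmetic (values copied from the poller output above):

```rust
// Quick sanity check of the tablet resolution math. Max coordinate
// values and physical dimensions come from the poller output above.
fn units_per_inch(max_units: f64, inches: f64) -> f64 {
    max_units / inches
}

fn main() {
    // Y axis: 13500 / 6.0 = 2250 units/inch exactly.
    println!("Y: {}", units_per_inch(13500.0, 6.0));
    // X axis: 21600 / 8.5 ~= 2541.1765 units/inch.
    println!("X: {:.4}", units_per_inch(21600.0, 8.5));
}
```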
15:24: How to turn off wacom touchpad (in sway)?
15:49: After some googling and JQ-ing, got both pen and finger disabled #timelog:01:09:42
They still work in my test poller.
15:59: Quick attempt at project links #resume-setup #timelog:00:20:06
I think I already have URL code in the TeX code?
oops maybe not
16:27: React Reading: Scaling up with Reducer and Context #react-managing-state #timelog:00:20:48
This concludes the "managing state" chapter.
18:42: Some "staggered voice" system design planning #concept-concerto #timelog:00:32:48
2024-07-07 Day 49. Mini drawing tablets, TIC-80 audio code
33 days left of batch.
prev: kickstart neovim / slider groups typescript react / rust-analyzer
next: || resume-setup: Add data points for projects. || consider-drawing-tablet: potentially buy a small drawing tablet to play with. || investigate-tic80-audio: look into tic80 audio code and music player.
Right arm is a little sore today, probably from typing too much yesterday. Going to lay off a little bit today.
09:21: Morning Triage
09:40: Data for projects, then done? #resume-setup
09:41: Impulsive idea to buy a drawing tablet today #consider-drawing-tablet
I've been on and off about this for a few days. I think it could be a good interface to try and tune vocal synth parameters. But, I should make a decision by end of day.
Thinking about using this with my number pad, somehow. I like the minimalism of the numpad, and consider it an interesting interface challenge for the task of musical sequencing. The tablet could be a good way to get continuous fine tuned gestural control.
Performance interface too?
09:48: want to examine audio code and capabilities in tic80 today #investigate-tic80-audio
How hackable is it? Is it feasible to shove my rust DSP code into the TIC-80 audio callback, and control it from tic80 code?
I think this could be a fruitful way to build an ecosystem of toys that match my current aesthetic. I have semi-long term plans for this potentially.
I say "semi-potentially" because for its constraints, the software has quite a complex software stack. I don't expect this to be the most portable thing in the world. The most portable things are ideas, and after that, bytes of data (8 bits to a byte is fairly universal). There is a cartridge format, which may end up being a promising data format for portability, but we'll see.
Short term (as in this batch), portability or longevity is not a concern. I just want to hack stuff.
09:56: Publish!
10:00: taking a walk, thinking about tablets and interfaces and other such things #consider-drawing-tablet
11:01: Tablet and mousepad purchased. #consider-drawing-tablet
During my walk, I realized this was a perfect device for exploring vocal synthesis control. A sort of gold standard I could use to try out interface concepts before porting to the web.
I'm starting to see building instruments for the web as "redux", as far as solo interactive musical experiences go. It's incredibly convenient, but ultimately the medium of the browser is still lackluster, and usually there are tradeoffs.
Future thinking: collaboration and share-ability are the big strengths of the web. Something I haven't really explored yet. Maybe that's something to think about?
11:10: Concerto: Vocal ensemble control is still a first-person experience #concept-concerto
I want to move forward with an ensemble demo, but coming up with the interface has been a blocker for me. I don't want to blindly recreate Blob Opera or Choir. It's been hard to see past that. How do you meaningfully control multiple voices?
Then I realized, it was never about controlling multiple voices in Blob Opera or Choir. You were always controlling one voice at a time, and the rest would follow. A leader-follower, or soloist-orchestra relationship. Such a thing I will call a "concerto" format.
11:17: drawing tablet demo needs to be realized now #drawing-tablet-demo
The initial proof-of-concept should ideally be flexible and modular. I'll want to prototype a few instruments with this thing. The inputs will be the 10 keys on my numberpad, and the XYZ inputs from the drawing pad. Realtime audio, etc.
14:07: Examining TIC-80 codebase for audio #investigate-tic80-audio #timelog:00:21:25
The idea is to find the files, then add them to codestudy.
I think I found the music. One file, sound.c, almost 600 lines. Not too bad, actually.
Adding to codestudy.
why didn't sound.c import?
oh it did, but the CSS broke. why did it break? Gotta look into this. again.
14:28: gotta see why CSS for this broke. #investigate-css-codestudy #timelog:00:44:46
Confirmed broken on my website as well.
git blame says CSS has been unchanged since it was initially added.
demos CSS still works fine.
Huh. So the file works on a smaller portion of sound.c.
Looks like an overflow bug.
15:15: Overflow error confirmed. Deeper investigation. #investigate-css-codestudy #timelog:00:07:30
The issue is, I don't know how this CSS works. It was just lifted from SourceHut.
It seems to be a flexbox attribute. I changed flex-wrap to be nowrap and the problem went away.
15:23: Back to tic80 #investigate-tic80-audio #timelog:00:49:55
15:42: What things call these functions? #investigate-tic80-audio
I want to see an audio callback.
17:01: Tasking.
17:06: Next steps #investigate-tic80-audio
I think I'm pretty close to understanding how the callback stores audio into buffers. There's a resampling step which I need to figure out more, but I believe I know where I can inject custom DSP. At that point, I'll be ready to make a follow-up task.
17:08: Adding project details to resume #resume-setup #timelog:00:30:32
17:41: Project links maybe? #resume-setup
This is looking pretty good though.
19:21: React: passing state deeply with context. #react-managing-state #timelog:00:31:46
2024-07-06 Day 48. Setting up NeoVim with Kickstart
34 days left of batch.
Prev: poking propulsion / prototype published. ponder: / participants pleased?
Next: || investigate-kickstart-neovim: Try to get to this today. || resume-setup: Create an initial TeX backend for my resume generator. || demo-react-UI: Slider groups, all the parameters.
11:20: (Late) Morning Triage
11:27: initial demo done. Follow-up? #demo-poke #consider-poke-follow-up
I've gotten good feedback from people, and was really interested in watching people interact with it. It's apparently has startled many cats! I may want to scope out some follow-up improvemnts to this, if there's time. Some things include: wall collision sounds, more dynamic interactions when poking, more variation in sound, and more empathy mechanics (feeding, laughter/tickling, etc).
11:33: final week7 career thoughts. #what-now-week7
It's been nice getting most of my resume structured as data. It's very "music tech" oriented, with a strong emphasis on DSP and "low-level" languages like C/C++. For some reason, I didn't expect for a narrative like that to so strongly emerge. I could probably find enjoyment in "systems" oriented things: studying databases, OSes, computer architecture, etc. Rust seems like a good way to stay relevant in these fields.
I think the leetcode is going to resume next week. If I can find time every day, I could do about 30 leetcode problems before this batch is over. But, leetcode takes more time than people think. An hour of focused time is quite valuable. If I can work my way up to 2 every day, that's most of LC-75 (80%). If I do 2 every other day, and at least one every day, that's about 45, or 60% of LC-75.
I want to get typescript under my belt. The handbook <<webdev/typescript_handbook>> is a little under 40 pages, so probably something I could feasibly read. Then again, maybe it's just good enough to throw up some sliders in React and just bump around typescript that way? How much of my time is it worth?
Maybe if I can get neovim up and running, I think I may have a better time doing webdev stuff and feeling "modern"? (He says while typing in a very vanilla vim editor in termux running on Alpine linux using vintage IBM colors.)
But damn, get this resume built up and send it off to Jobs Chat already jeez.
12:18: Publishing
12:20: Let's see how easy this "kickstart" thing is. #investigate-kickstart-neovim #timelog:00:51:20
12:41: It seems to be working. Some sort of LSP is installed I think, but I want autocomplete #investigate-kickstart-neovim
12:42: Attempting to install YouCompleteMe #investigate-kickstart-neovim
12:45: Wait wait, I think there's another way to get rust analyzer working using nvim-lspconfig #investigate-kickstart-neovim
13:02: Okay it works. Kinda neat. Trying to get typescript set up #investigate-kickstart-neovim
It works!
13:14: epilogue #investigate-kickstart-neovim #timelog:00:05:40
Key things I did to get things working:
- Clone the kickstart repo to ~/.config/nvim. That is to say, the repo becomes the folder "nvim" as your base config. Opening up nvim does the rest.
- The init.lua file in nvim is chock full of comments and README stuff. I searched and skimmed around for ways to turn on rust-analyzer with lsp-config; it was a line I had to uncomment.
- Also had to uncomment a line to include new plugins, like the typescript neovim plugin I found. Installing stuff through this plugin manager thing called Lazy is basically a matter of appending table values to the return table value in lua/custom/plugins/init.lua. Opening up nvim handles installing stuff.
- The LSP also does autoformatting. It will touch your files once they are opened, so note to self: be sure to commit the files before doing anything when they are opened in neovim for the first time.
15:59: Back to the hub. Now what.
16:00: Work out initial TeX backend based on cv.tex #resume-setup #timelog:00:43:48
16:47: play with the LED sign here in hub #LEDsign #timelog:00:12:27
oscsend did not work. But: HACK THE PLANET
18:44: now what.
18:48: Add glottis parameters. #demo-react-UI #timelog:00:18:50
19:08: Now for the region sliders #demo-react-UI #timelog:00:12:18
19:20: Sliders React demo is currently 600kb #how-big-are-react-apps
Did this by running npm run build, then examining the size of the "build" directory.
By comparison, my vanilla JS project was only 12kb.
19:44: Initial importing of webaudio stuff #demo-react-UI #timelog:00:32:39
2024-07-05 Day 47. Poke Prototype Published
35 days left of batch.
Prev: sounds and visuals / connected together for / the poking demo
Next: || demo-poke: add eyes and a mouth, sync mouth shapes to sounds, tweak sounds. || resume-setup: work up a little lua script to generate tex code using my old TeX template. || how-big-are-react-apps: Didn't get to this the other day. Experiment and observe. || demo-react-UI: Try to actually get to adding sliders to this. || investigate-kickstart-neovim if there is time. || react-managing-state read "passing data deeply"
07:49: Morning triage
08:06: Planning Demo Poke tweaks #demo-poke
Visuals: eyes and mouth, make sure they all collide against walls. Sounds: use envelope to modulate pitch, use smoothed gate for amplitude (decay doesn't work).
Resist the urge to tune and tweak all day.
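The "smoothed gate for amplitude" idea can be sketched as a one-pole lowpass applied to a 0/1 gate signal (a hypothetical sketch; the name Smoother and the coefficient formula are my assumptions, not the actual voxbox code):

```rust
/// Hypothetical sketch of smoothing a 0/1 gate for amplitude control:
/// a one-pole lowpass filter. The gate snaps on/off; the smoother
/// turns that into a click-free ramp.
struct Smoother {
    y: f32,
    a: f32, // feedback coefficient derived from a time constant
}

impl Smoother {
    /// `t` is the smoothing time in seconds, `sr` the sample rate.
    fn new(t: f32, sr: f32) -> Self {
        // Standard one-pole coefficient: exp(-1 / (t * sr)).
        Smoother { y: 0.0, a: (-1.0 / (t * sr)).exp() }
    }

    /// One sample of smoothing: y moves toward x at a rate set by `a`.
    fn tick(&mut self, x: f32) -> f32 {
        self.y = x + self.a * (self.y - x);
        self.y
    }
}
```

Feeding the raw gate through this before multiplying the audio avoids the clicks a hard gate would cause.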
08:09: Plan: use existing data today to make Tex Generator #resume-setup
08:11: Try to look into this today #how-big-are-react-apps
08:13: Try to add sliders today #demo-react-UI
I'm honestly losing motivation for this, but it's important to me that I do this in order to learn.
08:15: If there is time #investigate-kickstart-neovim
This is a pretty open day, so there's a chance I'll have time for this.
08:17: Two more chapters left here #react-managing-state
"passing data deeply" (23 pages) and "scaling up" (20 pages)
08:18: Publishing
08:30: Let's work on a face for my little guy #demo-poke #timelog:00:16:03
08:46: Initial face created. Next up: mouth sync #demo-poke #timelog:00:58:42
open/close and shape morphing
09:24: Initial open/close created. Now to interpolate shapes. #demo-poke
09:47: Testing on Android #demo-poke
09:49: Where did the tap to begin go? #demo-poke
09:55: Tuning and tweaking #demo-poke #timelog:00:49:37
09:56: First up, expand the poking radius #demo-poke
Sometimes the pokes don't register because you are out of bounds.
10:00: Map envelope to fundamental pitch, rate, jitter #demo-poke
10:16: add some jitter controlling velum #demo-poke
10:22: remove some low end #demo-poke
10:25: fine. add reverb #demo-poke
10:32: draw the frame last #demo-poke
That way, eyeballs can go off screen. I'm not going to do collision checks on mouth/eyes. Not worth the time.
10:39: More pitch tuning #demo-poke
I wasn't hearing what I wanted to hear. Turns out the jitter was getting in the way.
10:45: Android testing again #demo-poke #timelog:00:03:38
Feels okay enough. Maybe a bit too shrill on phone speakers. But I'll worry about mixing another time.
10:50: Merged "chatter" branch in voxbox. #demo-poke
10:54: can I still build my old tex resume? #resume-setup #timelog:01:00:06
Short answer is: no.
Okay, so I just want to preview the data right now. I'm going to render to markup.
11:49: I wrote a CV in plain tex once, trying to compile #resume-setup
It still works!
13:13: Back to the resume. Filling in more details. #resume-setup #timelog:01:04:52
14:31: More slider-ing #demo-react-UI #timelog:01:11:08
15:45: Pushing initial poke demo online. #demo-poke #timelog:00:07:39
16:00: Presentations
16:48: Work on week 7 recap.
18:56: Attempts to build FAUST from source. #build-faust-from-source
Gotta love this. There's a header file called execinfo.h that doesn't exist on musl (alpine), and there's a specific macro that disables including this file. There's probably a way to set this macro using a flag somewhere in a deeply nested Makefile, but it's just going to be easier to comment these bits out. It's only this file. FAUST is like this sometimes.
It is building...
I get weird permission issues when it builds? https://stackoverflow.com/questions/72978485/git-submodule-update-failed-with-fatal-detected-dubious-ownership-in-reposit. So sketchy.
19:19: Attempting to mod cmake files. Sigh. #build-faust-from-source
19:26: sketchy permissions warnings again. #build-faust-from-source
I have this feeling that one of the optional submodules is causing issues.
Oh, it was more sketchy stuff that I had to comment out. The "install" target of the Makefile had this weird line that was triggering the git issue.
2024-07-04 Day 46. Connected poking sounds to poking visuals
36 days left of batch.
Prev: built jitter signals / summer 2 conversations / learned about old pipes
Next: || create-chatter-sounds: In the example, add poke interactions, somehow. || demo-poke: Add eyes and mouth. || how-big-are-react-apps: Figure out, how big are these react apps when exported? || investigate-kickstart-neovim: See how far I can go getting kickstart to work with neovim.
08:32: Morning Triage.
Starting this at home on my Mac.
09:01: Poke interactions in chatter: thoughts. #demo-poke
The offline example should have some parts that will eventually be designed to be modulated by user input (poking). Poking can be simulated using a triggered gate. The gate would go into an envelope (envelopes?) of some sort as a way to shape things like fundamental pitch and jitter.
09:05: Eyes and mouth thoughts #demo-poke
Basically, my source of inspiration for this one are those poptart commercials that were around during the early 2000s <<demos/poptarts>>
It'd be nice to get collision working for the eyes, because I think they will bulge out of the face occasionally.
The mouth shape needs to be flexible enough to support a variety of shapes. I want these to be synchronized with the mouth shapes being synthesized by the audio.
09:12: Trying to look into this today #investigate-kickstart-neovim
09:13: I would like to have a working demo by EOD Friday (tomorrow) #demo-poke
09:14: Publishing logs from yesterday
09:29: I really need to focus and get this done by EOD Friday #resume-setup
I keep procrastinating and yak-shaving on this.
10:04: Adding tgate. #demo-poke #timelog:00:19:46
10:24: Creating metro #demo-poke #timelog:00:06:45
This will drive the tgate. This can be made from a phasor. Any time the phasor resets, produce a trigger.
Created a phasor-to-trigger utility called PhasorTrig.
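The reset-detection idea can be sketched like this (a hypothetical standalone version; only the name PhasorTrig comes from the log, the real voxbox code may differ):

```rust
/// Hypothetical sketch of the phasor-to-trigger idea: emit a
/// one-sample trigger whenever an externally supplied phasor wraps.
struct PhasorTrig {
    prev: f32,
}

impl PhasorTrig {
    fn new() -> Self {
        PhasorTrig { prev: 0.0 }
    }

    /// Tick once per sample with the current phasor value (0..1).
    /// Returns 1.0 on the sample where the phasor resets, else 0.0.
    fn tick(&mut self, phs: f32) -> f32 {
        // A reset shows up as the phasor value dropping below the
        // previous value (e.g. 0.97 -> 0.02).
        let trig = if phs < self.prev { 1.0 } else { 0.0 };
        self.prev = phs;
        trig
    }
}
```

A metro, then, is just a free-running phasor feeding one of these.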
10:32: Initial test of tgate in chatter example #demo-poke #timelog:00:10:10
Actually, I'm building out a metro that combines phasor and phasortrig.
10:42: I want an envelope now. #demo-poke #timelog:00:26:20
I will opt to port the envelope I made in sndkit.
11:10: Initial envelope code ported. Now to see if it works. #demo-poke #timelog:00:07:16
11:13: Loud distorted noises (it didn't work). Investigating #demo-poke
Now it works. Forgot to add the exp and update pgate.
11:30: Lunch
13:34: Refactoring chatter, getting it ready for poke. #demo-poke #timelog:00:42:36
Chatter sound needs to be consolidated into a struct rather than in a main function. There also needs to be a poke method, which will eventually be the thing called when the user pokes the circle on the screen.
13:35: At this point, the base chatter sounds have been made #create-chatter-sounds #demo-poke
So I think this task can be considered completed. The rest of the sound design work will be polish for the poke demo.
14:16: Fully encapsulated chatter into struct #create-chatter-sounds #demo-poke
14:17: Now, to implement poke method #demo-poke #timelog:00:12:11
Up to this point, "poking" has been simulated using a metro object. This metro needs to be pulled out of the struct, and replaced with a gate fed into a threshold generator. The metro function will exist outside, and when it ticks an impulse, it will call the poke method. The poke method will flip a gate signal on and off. Inside ChatterBox, it will detect changes in the gate and turn those into triggers like before.
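The gate-flip and change-detection scheme could look something like this (a hypothetical sketch, not the actual ChatterBox code; the name PokeGate is made up):

```rust
/// Hypothetical sketch of the poke mechanism described above: the UI
/// flips a gate on each poke, and inside the voice any change in that
/// gate becomes a one-sample trigger.
struct PokeGate {
    gate: f32,
    prev: f32,
}

impl PokeGate {
    fn new() -> Self {
        PokeGate { gate: 0.0, prev: 0.0 }
    }

    /// Called from the UI thread/event handler: flip the gate.
    fn poke(&mut self) {
        self.gate = 1.0 - self.gate;
    }

    /// Called per sample in the audio code: returns 1.0 on any
    /// sample where the gate changed state, else 0.0.
    fn tick(&mut self) -> f32 {
        let trig = if (self.gate - self.prev).abs() > 0.5 { 1.0 } else { 0.0 };
        self.prev = self.gate;
        trig
    }
}
```

Detecting changes rather than only rising edges means every poke produces a trigger, regardless of the gate's current polarity.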
15:27: Making sure graphics work on my mac #demo-poke #timelog:00:01:34
15:29: Connecting sound to graphics. #demo-poke #timelog:00:17:55
It works! It's a coarse experience, but it sorta works.
15:56: Trying out on Android. #demo-poke #timelog:00:03:15
termux is awesome, because I can run it locally on my phone.
It works. definitely needs some refinement. But a good enough stopping point.
16:22: Add "tap to begin" mechanic #demo-poke #timelog:30:12
17:06: Debugging tap to begin on mobile #demo-poke
17:15: Attempting to SSH #demo-poke #timelog:00:16:55
18:14: Maybe a div/canvas onClick event handler will work? #demo-poke #timelog:00:06:17
Okay that works! Nice.
2024-07-03 Day 45. Jitter, Summer 2s First day in Hub
37 days left of batch.
Prev: Tiny Creature Sounds / Told Some People About It / Then I installed zig
Next: || create-chatter-sounds: Build up some jitter constructs. || add-demos-page: Quickly make a demos page with links. || react-managing-state: Most likely will have time to do some reading on this. || resume-setup: Building the metadata structure out more. || dagzet-lists: Implement ordered lists in dagzet (needed for dagzet setup).
07:59: Adding some words to landing page
08:00: Morning triage.
08:08: I want to build some jitter constructs today #create-chatter-sounds
The first will be a randi like module clocked with an external phasor. The second will be a phasor whose rate can be randomized every period. Together, they can build a pretty decent jitter signal.
I'll probably be extracting my LCG logic into an RNG module as well.
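A rough sketch of how these two pieces could fit together, the RNG module and the phasor-clocked randi (illustrative only; the LCG constants, names, and structure are my guesses, not the actual voxbox code):

```rust
/// Illustrative sketch of a phasor-clocked "randi": a random line
/// segment generator that picks a new target each time the external
/// phasor resets, and linearly interpolates across the period.

/// Basic 32-bit LCG (constants from Numerical Recipes), standing in
/// for the RNG module mentioned in the log.
struct Rng(u32);

impl Rng {
    /// Returns a value in [0, 1).
    fn next(&mut self) -> f32 {
        self.0 = self.0.wrapping_mul(1664525).wrapping_add(1013904223);
        (self.0 >> 8) as f32 / (1u32 << 24) as f32
    }
}

struct Randi {
    rng: Rng,
    from: f32,
    to: f32,
    prev_phs: f32,
}

impl Randi {
    fn new(seed: u32) -> Self {
        let mut rng = Rng(seed);
        let from = rng.next();
        let to = rng.next();
        Randi { rng, from, to, prev_phs: 0.0 }
    }

    /// Tick with an external phasor (0..1). On a phasor reset, the
    /// old target becomes the new start point (keeping the line
    /// continuous), and a fresh target is drawn.
    fn tick(&mut self, phs: f32) -> f32 {
        if phs < self.prev_phs {
            self.from = self.to;
            self.to = self.rng.next();
        }
        self.prev_phs = phs;
        self.from + (self.to - self.from) * phs
    }
}
```

Because the clock is external, the same phasor can drive several of these in sync, or a wobblier phasor can make the segment lengths fluctuate.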
08:10: Going to tackle this this morning #add-demos-page
08:11: Ordered lists in dagzet needed for resume #resume-setup #dagzet-lists
I know, I know, I'm overengineering this.
08:17: Kickstart: way to set up neovim with LSP? #investigate-kickstart-neovim
Dan mentioned using VSCode again, and I'm inclined to take his advice. However, before I try that, I want to see if I can get some decent LSP things working with NeoVim. Would be especially helpful if I can get it running on this Alpine thinkpad.
Found Kickstart while trying to google "typescript in neovim". This was the other thing mentioned too: https://github.com/pmizio/typescript-tools.nvim.
08:21: Getting ready to publish.
08:36: Start implementing ordered lists in dagzet #dagzet-lists #timelog:00:45:56
As reference, there is my resume dagzet code which has some prototype syntax.
08:35: I really need to rewrite dagzet #dagzet-rust
My lua implementation was always designed to just be a prototype, and nowadays it's getting quite heavy. Now that I'm a bit more familiar with the language, I don't think it'd be too difficult to build it in Rust, tbh. I think the standard library has enough rich data types that it should be pretty straightforward.
08:52: Now I need to think about table schema #dagzet-lists
I think I know what I want in the schema, but it comes down to naming. Working out the words here.
A list belongs to a node, and has items in the list, which are other nodes in the current namespace. Each item has a position in the list, which is known. I will want these positions to be zero-indexed.
09:05: I am inconsistent with design philosophy here #dagzet-lists
This approach I'm using for representing lists is more SQL-y compared to, say, how I'm handling dz_lines. I'm using more rows: each list item gets a row with a position. In theory, I should be able to reconstruct the list using some join and amalgamation operation in vanilla SQL.
By comparison, the other parts of this code represent lists as JSON arrays encoded as strings. SQLite does come with JSON bindings now by default, so it seems this is reasonably portable. But it does require that extra JSON parser, and a part of me wonders if that was the right decision.
So, consider this an experiment. I'll have both: more SQL-driven and more JSON-driven, and I'll see which one feels better over time.
09:10: Making SQL query to reproduce the list #dagzet-lists
Nice! There was a double join on the same table I had to do and I pretty much just guessed it.
The query I made:
SELECT dz_nodes.name, group_concat(itemlist.name) FROM dz_lists
INNER JOIN dz_nodes on dz_nodes.id == dz_lists.node,
dz_nodes as itemlist on itemlist.id == dz_lists.item
GROUP BY dz_nodes.name
ORDER BY dz_lists.position;
09:35: Off to make a simple RNG module #create-chatter-sounds #timelog:00:07:07
09:41: Implementing randi #create-chatter-sounds #timelog:00:17:31
09:59: Testing random line on chatter example #create-chatter-sounds #timelog:00:08:54
10:09: Working out jitter line #create-chatter-sounds
An ideal jitter line would be a random line segment clocked by a phasor that has some kind of variable fluctuation in timing. But how does that phasor's rate get controlled?
The best answer I can come up with is to have a specialized phasor whose rate can be smoothly randomized within a range. This, combined with the random line generator made previously, would build up the components for a jitter line generator.
As a test, this random phasor could be used on the chatter test to modulate how quickly the shapes are changing.
10:18: Building out random phasor #create-chatter-sounds #timelog:00:21:22
Yes, I think I'm going to call it RandomPhasor. I'm going to copy and paste the phasor code for now, since there's not a lot of code. It just doesn't seem worth it to DRY it right now.
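A sketch of how such a RandomPhasor might work (an illustrative guess: the LCG constants, field names, and rate mapping are assumptions, not the actual voxbox code):

```rust
/// Illustrative sketch of the RandomPhasor idea: a 0..1 ramp whose
/// rate is re-randomized within [min_rate, max_rate] on every wrap.
struct RandomPhasor {
    phs: f32,
    inc: f32,
    min_rate: f32,
    max_rate: f32,
    sr: f32,
    rng: u32,
}

impl RandomPhasor {
    fn new(sr: f32, min_rate: f32, max_rate: f32, seed: u32) -> Self {
        let mut p = RandomPhasor {
            phs: 0.0, inc: 0.0, min_rate, max_rate, sr, rng: seed,
        };
        p.randomize_rate();
        p
    }

    fn randomize_rate(&mut self) {
        // LCG draw in [0, 1), mapped linearly into the rate range.
        self.rng = self.rng.wrapping_mul(1664525).wrapping_add(1013904223);
        let r = (self.rng >> 8) as f32 / (1u32 << 24) as f32;
        let rate = self.min_rate + (self.max_rate - self.min_rate) * r;
        self.inc = rate / self.sr;
    }

    /// Produces the ramp; picks a fresh rate each time it wraps.
    fn tick(&mut self) -> f32 {
        let out = self.phs;
        self.phs += self.inc;
        if self.phs >= 1.0 {
            self.phs -= 1.0;
            self.randomize_rate();
        }
        out
    }
}
```

Clocking the random line generator made previously with this phasor gives segments of fluctuating length, which is the jitter-line behavior I'm after.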
10:53: Implementing Jitter, then that's enough of this for the day #create-chatter-sounds #timelog:00:19:03
This completes the idea I was thinking about
11:50: Got roped into conversation, but clock is resumed #create-chatter-sounds
Unexpected Knowledge Acquisition: Learning about some of the old oil pipe constructions in the US, and their ecological/environmental implications. Apparently some of these pipes were transferred in carts in a way that the pipes would rub up against each other. The friction would produce weak spots, which would turn the pipes into "swiss cheese". I just love that imagery.
Someone in summer 2 wants to work on boot sector games, and I am in full support of this.
15:31: had lunch, lots of conversations, now on floor 5.
15:32: back to adding structure to resume. #resume-setup #timelog:00:40:44
15:50: I need to think about how I'm going to template this. #resume-setup
As a sidetrack, try to see how hard text templates are to work with.
16:16: This was a very unproductive 40 minutes. #resume-setup
I tried exploring go templates briefly. I don't think this is really going to help me. The data format is what is most important. I think I'm just going to whip something up in janet, shell, or lua.
17:59: After crashing at my place, Back at the hub.
Triaging, figuring out what to do next.
18:02: Time to whip up a demo page. #add-demos-page #timelog:00:20:48
Oh. Looks like I had a placeholder page here already.
Ah, what makes this interesting is that these demos aren't wiki pages, can't use ref, and you can't have them be typical links either.
Added a new janet function for creating these links called "pagelink", which doesn't do any checks.
18:37: React Reading: Extracting State Logic into a Reducer #timelog:00:46:27
19:39: Evening Triaging
19:40: I keep putting this off. #demo-react-UI
Part of my hesitation with my initial "port it to react" idea is: how bulky will this end up being on my website? Do I even want it there? Does it even matter if it's just code that's here in my Recurse monorepo and nowhere else?
19:42: Created "demos" taskgroup.
See: demos/taskgroups/demos.
19:45: Creating task to investigate size of react apps. #demo-react-UI #how-big-are-react-apps
One of my concerns with building a react app is that if I host these on my droplet, it's going to clog things up and take space. What I want to know is: when I go through the process of minifying and exporting the code, how much space does it end up being? What goes into uploading and running this on my webserver?
My other concern (which I'm trying not to think too hard about) is: my website is 99% static, and introducing JS could mean adding an attack surface to my website. Probably just being paranoid.
19:49: two minutes. quick! think of some haiku ideas.
built jitter signals / summer 2 conversations / then I napped a bit
2024-07-02 Day 44. Initial Tiny Creature Sounds, MIDI knobs, slightly more typescript, Zig
38 days left of batch.
Prev: Poking Propulsion / Works better in Chromium / Stored State in Sliders
Next: || resume-setup: more structuring. overengineering stuff (of course I would do this). || create-chatter-sounds: some initial code setup for an example in VoxBox, and some sound design planning because a version of this will eventually end up in Poke. || demo-react-UI: Now that we've officially started the basics, time to move along and get all the sliders up.
08:20: Morning Triage
08:30: Finished. Next steps... #typescript-react-slider
Next steps are to port the rest of the UI into React
08:32: Ready to start working on this #demo-react-UI
I think I know enough now to start getting the rest of this UI ported to React. I can just focus on the UI elements first, and then worry about hooking it up later.
08:36: added earlier this morning #add-wasm-dz
08:37: chatter sounds are needed for poke #create-chatter-sounds #demo-poke
The most important thing for this demo is the vocal-like chattering. So this example will more or less aim to build the model. The new thing to try out will be morphing between DRM shapes. The critter here will be quite small, so the tract size will be quite small.
I will be thinking more about the design today.
08:40: Chatter sounds are precursor to morphing shapes with gesture #voxbox-shape-morphing
The chatter sound model will feature DRM shape morphing using a pre-defined set of tract shapes. This will be a simplified version of what I'm expecting to do with the Gesture shape morphing. The chatter system will interpolate using a phasor, but there won't be the rephasor component. The code structure will also attempt to be simplified as well.
08:47: How to factor in leet-grinding into my schedule? #what-now-week7
I want to see how much I can do of the LC75 problem sets.
08:51: Publishing and Zulip
08:59: Making some initial tract states #create-chatter-sounds #timelog:00:35:23
Going to use my phone for this, as the touchscreen is the best interface I have right now.
09:32: shape exploration is slow and tedious #create-chatter-sounds
I need to find a better way
09:43: Getting initial boilerplate code setup #create-chatter-sounds #timelog:00:27:00
10:10: initial boilerplate with shapes created #create-chatter-sounds
10:11: Time to modulate the shapes linearly with a phasor #create-chatter-sounds #timelog:00:15:28
10:43: Structuring my graph using dagzet notation. #resume-setup #timelog:00:13:10
I think I want to implement an ordered list command.
11:00: Lunch
11:30: Webdev hangout
12:30: Audio hangout
13:30: Back to hub
14:10: Catching up with logs, hanging with Dan
15:36: A small pocket of time.
15:37: I need to implement some kind of jitter #create-chatter-sounds
Thinking about something like randi, but phasor-driven. And a phasor that can have randomized periods.
15:38: Testing out MIDI controller
amidi --dump -p hw:1,0,0
15:46: Set up some boilerplate code for react typescript demo #demo-react-UI #timelog:00:15:00
16:00: Zig
2024-07-01 Day 43. Week 7. Audio Troubleshooting, Moving Circles, Stateful Sliders in React and Typescript
jump to top. 39 days left of batch.
Prev: The creeping answer Of a gigantic bug hunt: Check for foreign nodes.
Next: || add-wasm-dz: missing file in my repo, on one of my computers. Quick thing. || resume-setup: read the way to set up resume, and make a new one. || demo-poke: pre-work on this || typescript-react-slider: studied the boilerplate yesterday, I feel ready enough to add a slider. || troubleshoot-webaudio-thinkpad This would be very helpful to figure out today.
08:26: Morning Triage
08:33: Do this at lunch today #add-wasm-dz
08:36: Plans on doing this today #resume-setup
08:39: Pre-work on this today #demo-poke
Goal will be to figure out initial technical scaffolding, getting p5 in a windowed context, mouse pointer stuff, etc.
08:41: added a task for ensemble demo #demo-ensemble
This will build on the vocal chords demo, and will most likely be something vaguely like Blob Opera and Choir. Slightly lower priority than demo-poke because that task will let me focus on details for one face and voice, which can be a helpful learning experience when I go to do the choir app.
08:43: An actual demos page would be useful this week #add-demos-page
This shouldn't take any time at all.
08:45: Ready to actually add a slider now #typescript-react-slider
After studying the boilerplate in way too much detail yesterday <<codestudy/hello_ts_react/app_tsx>>, I feel ready for this.
08:48: I really need to make it possible to link graphs #dzgraph-hyperlinks
You can't actually use dzref to link to graphs, only nodes. Hoping to fix that cuz it comes up a lot.
08:51: Poke demo is priority this week #demo-poke
It's a good contrast from the vocal chords demo, and a good way to start the more whimsical toyetic vocal demos.
08:56: publishing logs from yesterday
09:03: Let's see if I can find more things out on this #troubleshoot-webaudio-thinkpad #timelog:00:32:02
09:06: Glicol works, but my project does not #troubleshoot-webaudio-thinkpad
This has been a repeating theme: my audio code doesn't work everywhere, but Glicol does tend to work everywhere. Why?
09:07: Glancing at glicol JS audio code #troubleshoot-webaudio-thinkpad
Found an interesting line:
window.AudioContext = window.AudioContext || window.webkitAudioContext
09:17: Sound is broken in glicol now #troubleshoot-webaudio-thinkpad
More importantly: sounds suddenly worked for my vocal demo?
09:27: Not sure why but I guess it's all working now #troubleshoot-webaudio-thinkpad
Both my vocal chords and glicol demo work.
I did do some cache clearing in the dev tools, so that could be partially to blame for why it's suddenly deciding to work.
09:30: webkit is safari? #troubleshoot-webaudio-thinkpad #vocal-chords-older-iphone
Unrelated, but that webkit line may be how to get sound working on some iphone devices.
I've made a new task for this vocal-chords-older-iphone.
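That prefix-fallback line can be captured in a tiny helper. A hedged TypeScript sketch: `pickAudioContext` and `AnyWindow` are hypothetical names, not an API from glicol or the browser.

```typescript
// Older WebKit browsers (Safari, older iOS) expose the Web Audio
// constructor as window.webkitAudioContext instead of the standard
// window.AudioContext. Pick whichever exists.
type AnyWindow = { AudioContext?: unknown; webkitAudioContext?: unknown };

function pickAudioContext(w: AnyWindow): unknown {
  return w.AudioContext ?? w.webkitAudioContext;
}
```

Note that on iOS the context also typically needs to be created or resumed inside a user gesture before sound will play.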
09:43: I am drafting up a new resume #resume-setup #timelog:00:21:27
Screw it, outlining in markdown. Text will set us free.
10:05: An outline of a resume has been made in markdown #resume-setup
It is stored in this repo under resume/resume.md.
I should pandoc this at some point. -- okay, it didn't work out of the box and this frustrates me.
Crazy overengineered idea. Write the resume in dagzet, compile to SQLite, then produce tex structure from this.
10:24: Chattering gibberish sounds are needed for critter #demo-poke #create-chatter-sounds #voxbox-shape-morphing
This could also be a good precursor to the shape morphing stuff, as all the code could be a bit more relaxed and inside some sample Rust code.
10:30: Attempting to get some p5 code working #demo-poke #timelog:00:27:02
10:41: the isorhythms code works, but, it chokes. a lot. #demo-poke
Not a good sign for future developments on the thinkpad.
10:58: Trying chromium #demo-poke
Chromium plays isorhythms okay! Just one of those things.
11:30: Back home for Lunch
13:00: Attempted C Creatures Hang
I was there for 15 minutes, and nobody showed up.
13:40: Back at the hub. Back to p5 context? #demo-poke #timelog:01:00:08
Now that things work if we "run in chromium" (sigh), things can move forward.
13:47: Found the documentation #demo-poke
Found the documentation: https://archive.p5js.org/reference/#/p5/p5.
14:07: Initial p5 boilerplate working #demo-poke
Going to put it in a 400x400 window, or something.
scratch that, let's see what 400x680 does. approximately pacman proportions.
14:33: Adding wall collisions #demo-poke
It doesn't work, but it's a good start.
15:57: Nice pairing with Binh on getting a slider in React #typescript-react-slider #timelog:00:15
Approximating the timelog to be about 15 minutes. Solution was much less complicated than my thinking. Instead of wrapping the slider in React, just use stateful hooks.
16:28: Trying to make the bounce work #demo-poke #timelog:00:12:14
16:33: Oh, circle is diameter, not radius. #demo-poke
16:43: Closer, but not right still. Pen and paper now #demo-poke #timelog:00:15:00
17:00: Lots of pairing and people stopping by to help me #timelog:01:00:00
And we are done!
19:47: React Reading: preserving and resetting state. #react-managing-state #timelog:00:43:21
20:37: Fixing some dagzet code in hello ts typescript
20:45: Thinking about haiku bits for tomorrow.
Poking Propulsion
Worked Better In Chromium
Markdown Resume
New Resume Time
Stored state in sliders
2024-06-30 Day 42. (Something something Deep Thought)
jump to top. 40 days left of batch.
Previous: Typescript and React \ Topological Sort Bugs \ Singing Quite Loudly
Next: || voxbox-shape-morphing: set up initial mechanics for morphing shapes in voxbox. || troubleshoot-webaudio-thinkpad: Figure out why sounds aren't playing in the browser on my thinkpad. || typescript-react-slider: Try to get a single input slider working in React using typescript
09:43: Morning Triage.
09:50: Sunsetting morning triage task #morning-triage
I initially was going to time my morning triage stuff. But I don't think I want to do that.
Also, comments on the triage itself aren't all too helpful. I'd much rather comment on tasks themselves
09:52: core mechanic for this shouldn't be too hard #voxbox-shape-morphing
09:58: webaudio is broken on firefox thinkpad. #troubleshoot-webaudio-thinkpad
Sounds do work on youtube, so I am getting browser sound. No clues in console. Locally hosted versions do not work either.
10:02: Uploading and posting to zulip
Ah, still running into issues with uploading. I didn't finish setting things up.
Ooops I just blew up my recurse site. ROLL BAAACK.
10:11: Oh god it's all broken. starting a new task. #site-debugging-jun30
Seeing what I can accomplish in 15 minutes, then I have to quickly leave for errands.
10:15: Ooops forgot to install rsync. #site-debugging-jun30 #timelog:00:20:00
10:16: codestudy is getting errors in export #site-debugging-jun30
Looks like the escape string is wrong. I am using "'" instead of """. I did this to myself when I was trying to update the SQLite generator scripts (they got stricter about single vs double quote usage apparently. Technically, you're supposed to use single quotes only, but they used to allow both interchangeably).
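For reference, SQLite string literals use single quotes, and an embedded single quote is escaped by doubling it (`''`), not by backslash-escaping. A small TypeScript sketch of a quoting helper (`sqliteQuote` is a hypothetical name, not from the generator scripts):

```typescript
// Wrap a string as a SQLite literal: single-quote delimiters, with any
// embedded single quote doubled per the SQL standard.
function sqliteQuote(s: string): string {
  return "'" + s.replace(/'/g, "''") + "'";
}
```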
10:23: I deleted some tasks #site-debugging-jun30
I think this is fine actually? Looking at them, it seems like I meant to delete them. I was worried that I accidentally deleted the tasks or something.
10:35: Errands
10:51: I found a Haiku during my errands
I've just decided to start doing my daily check-ins as Haikus, so I was destined to find this today. It was printed on a wall of a building. A haiku by Richard Wright. The creeping shadow / Of a gigantic oak tree / Jumps Over the Wall
11:36: Back at Hub. Setting up.
11:37: Initial voxbox-ing setup #voxbox-shape-morphing
11:39: Oh, right. I need to build mplayer from source #voxbox-shape-morphing #mplayer-jack-thinkpad
This thinkpad isn't fully provisioned yet. Upstream mplayer doesn't have jack support, so I need to build it from source.
I remember this taking a while. Uh, maybe I should look at other things.
Nevermind? It just finished as I wrote this. That's not a great sign.
Ran into build issues. Great. Downloading from git, using the APKBUILD as hints.
MPlayer is kinda SVN? But this build script is using git?
11:40: Attempts to get mplayer working on thinkpad begins. #mplayer-jack-thinkpad #timelog:00:48:13
12:01: Going to try building the package instead of source. #mplayer-jack-thinkpad
Not what I wanted to do, but here we are.
Bandwidth is slow even at the hub. git feels slow too. I wonder how much faster it'd be on a newer laptop.
12:09: going to attempt to get abuild to run #mplayer-jack-thinkpad
12:19: It builds! But I wasn't sure how to install it #mplayer-jack-thinkpad
Build time wasn't too bad, so I'm rebuilding.
12:20: Where is "pkgdir"? #mplayer-jack-thinkpad
Package installed in packages/community/x86_64.
12:24: attempting to install manually #mplayer-jack-thinkpad
Can be done with apk add foo.apk. I don't think I needed the --allow-untrusted flag because it was self-signed.
12:26: works! #mplayer-jack-thinkpad
I love how straightforward that was.
13:35: Back at the hub. Thinking about what to do next.
13:39: Made a formal task for slider #typescript-react-slider
I created the project with this command: <<webdev/npx_create_react_app_typescript>>.
13:41: Time to think about resume this week #what-now-week7 #resume-setup
13:51: Finally getting around to implementing initial shape morphing #voxbox-shape-morphing #timelog:00:09:00
I think...
13:58: This isn't going to work the way I thought, and I'm happy about that. #voxbox-shape-morphing
Initially I was going to add a small method to the tract that linearly interpolates two DRM tract shapes, given a position. BUT, the DRM shapes aren't in there. So, I actually have to plan out a new interface.
14:00: Working out interface (ink and paper) #voxbox-shape-morphing #timelog:00:14:34
14:35: Making a cooler looking lock screen for swaylock
14:48: Re-examining GSG code. Generics possible? #voxbox-shape-morphing #timelog:00:25:03
Basically, I want to generalize the values that can be interpolated. It is currently set to be 32-bit floats, BUT, it'd be nice to make them generics. That way, the shape morphing can work on those generics.
15:01: some initial refactoring completed #voxbox-shape-morphing
The core Gesture and GestureVertex structs take in generic arguments, with the rest hardcoded to take in f32. This was much less friction than I expected.
Next steps: have the linear gesture signal generator take in generics as well. After that, the generics can be refit to take in shapes. Somehow. (Indices? references?)
15:40: attempts to look at this typescript sample code #typescript-react-slider #timelog:00:54:42
Does it look like the other react code I've been examining?
17:15: React Reading: Sharing State Between Components #react-managing-state #timelog:00:24:29
18:17: Another session of troubleshooting #bug-missing-nodes #timelog:00:51:53
18:24: Ah, the node IDs changed I think. good for me. #bug-missing-nodes
Actually not too much of an issue, I have enough printing happening now that I can update when/if needed.
18:29: Examining all the connections now #bug-missing-nodes
The magsign_cpp is what we are investigating. What would need to happen for it to not show up in cpp?
18:33: =magsign_struct= is the only child that doesn't exist #bug-missing-nodes
Not sure if that is helpful or not.
Of the three, it's also the only one that only shows up once as a left-node (it only has one parent node). That parent is magsign_cpp.
18:39: It's probably the external nodes. #bug-missing-nodes
As we go down this tree, we are approaching the only two external nodes here: rust/std and rust/std_iter_zip. The TOC generates the links just fine, but it just seems like anything that has those links doesn't want to show up.
18:40: Also, edges for those external are returning NULL. #bug-missing-nodes
What's up with that?
18:41: commenting out the external nodes, I want to see what happens.
18:43: People of Earth, we have our smoking gun. #bug-missing-nodes
Removing those external nodes causes it to render properly.
18:47: Working out another hunch #bug-missing-nodes
I don't think the edges are supposed to be nil, even though they are here.
I have another hunch: there is a local node called "rust", and the namespace for these external nodes also starts with "rust". Going to re-introduce the external nodes, and change the local "rust" node to be "rustrust". It could be that there is some faulty logic related to string matching somewhere.
Neither worked. Okay, so I have to check out those nil edges now.
18:51: Tracking down those nil edges. #bug-missing-nodes
19:00: I think I did it. #bug-missing-nodes
While populating a local copy of the edges table (to be pruned later by the topsort algo), I added a check to see if the left node id shows up in the nodes list. If it doesn't, it indicates that it is an externally linked node outside of the namespace.
TL;DR: Check for foreign nodes.
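The fix described above can be sketched as a foreign-node filter in front of Kahn's topological sort. The real code is Lua; this is a hedged TypeScript illustration, and the node and edge names are made up to mirror the bug.

```typescript
// Kahn's algorithm, with edges dropped when either endpoint is not in
// the local node list. Without this filter, a foreign (external) parent
// leaves its child's in-degree permanently nonzero, so the child (and
// everything downstream of it) never gets emitted -- the missing-nodes bug.

type Edge = { left: string; right: string };

function topsort(nodes: string[], edges: Edge[]): string[] {
  const local = new Set(nodes);
  // foreign-node check: keep only fully-local edges
  const kept = edges.filter(e => local.has(e.left) && local.has(e.right));
  const indegree = new Map<string, number>();
  for (const n of nodes) indegree.set(n, 0);
  for (const e of kept) indegree.set(e.right, indegree.get(e.right)! + 1);
  const queue = nodes.filter(n => indegree.get(n) === 0);
  const out: string[] = [];
  while (queue.length > 0) {
    const n = queue.shift()!;
    out.push(n);
    for (const e of kept) {
      if (e.left !== n) continue;
      const d = indegree.get(e.right)! - 1;
      indegree.set(e.right, d);
      if (d === 0) queue.push(e.right);
    }
  }
  return out;
}
```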
19:04: cleanup debug code #bug-missing-nodes
19:37: refactor linear gesture to take in generics #voxbox-shape-morphing #timelog:00:21:41
Okay that didn't work.
19:49: stop writing code, think about the problem. Why generics? #voxbox-shape-morphing
This gesture is currently hard-coded to work with floats as values. Which, most of the time, is fine. However, for shape morphing, the values that need to be interpolated aren't floating point values, but tables.
The hope with generics was to be able to elegantly use the existing linear gesture logic to create gestures that morphed between DRM shapes.
Things would be set up in such a way that when tick() would be called, it would know to interpolate between two DRM tables and store the result in another table. That table could then be taken and passed to the vocal tract.
Stopping for now because I've had enough of a day.
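The generics idea can be sketched like this, using TypeScript for illustration (in the actual Rust gesture code this would be a trait bound rather than an interface). All names here are hypothetical.

```typescript
// Generalize the gesture's value type over anything that knows how to
// interpolate itself, so scalars and DRM-style tables can share the same
// linear gesture logic.

interface Lerp<T> {
  lerp(a: T, b: T, t: number): T;
}

const scalarLerp: Lerp<number> = {
  lerp: (a, b, t) => a + (b - a) * t,
};

const tableLerp: Lerp<number[]> = {
  lerp: (a, b, t) => a.map((v, i) => v + (b[i] - v) * t),
};

// A minimal "linear gesture": pick the current pair of vertices from a
// 0..1 phase and interpolate between them (roughly what tick() would do).
function gestureValue<T>(vertices: T[], phase: number, ops: Lerp<T>): T {
  const pos = phase * (vertices.length - 1);
  const i = Math.min(Math.floor(pos), vertices.length - 2);
  return ops.lerp(vertices[i], vertices[i + 1], pos - i);
}
```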
2024-06-29 Day 41. Halfway.
jump to top. 41 days left of batch.
Exactly halfway now. Perfectly balanced.
Moving forward, I'm going to be reducing my check-in presence on Zulip now. I'll still post links to my daily logs, but no words until the end of the week (Friday), where I'll do a weekly digest.
I have an idea-concept for my interactions at the hub. I want to be more physically present here. Now that I have this Fancy Laptop, why not try to set up a usual spot and work out in the open? The corner of the fourth floor lunch table is a strategic location. Lots of people around here, and it's also by a power outlet (may need to get an extension cord). My goal is to become furniture at the hub. Be the rock in the center of the river.
Actually, what if daily check-ins could be like haikus? It's fun and creative, while also being an exercise to summarize the day.
Haiku for previous day: where did my nodes go? / investigate TIC-80 / remaining weeks planned
Today: || bug-missing-nodes: Plug away at this for a bit. || react-managing-state: Print this out, start reading a bit. || buy-extension-cord: I need a short extension cord for my charger so people don't trip over it. || toc-logs:
10:28: Late morning thoughts at the hub.
10:42: Didn't get around to doing this one. Oh well. #voxbox-vcv
10:43: I studied enough here. #vcv-potential-study
It was a helpful exercise, but I need to move forward.
10:44: Bumping to priority because the page is starting to get big. #toc-logs
10:46: Need a cord so people don't trip over my AC adapter #buy-extension-cord
Probably a target run today.
10:48: zulip-ing
10:51: Oh shoot, this machine isn't able to push to my website. Going to have to fix that when I get back home.
13:03: Extension cord acquired! #buy-extension-cord
11:20: Quick pairing with M: learning how to make typescript react
installed npx:
sudo npm i -g npx
Got to figure out why this npx command didn't work:
npx create-react-app my-app --template typescript
Solution: https://stackoverflow.com/questions/7718313/how-to-change-to-an-older-version-of-node-js
Install n, run n stable. Basically, the version of node was "bleeding edge".
And we're running!
14:01: Getting thinkpad SSH key ready #thinkpad-uploader
14:38: Back at the hub.
14:39: Looking at this again today. #bug-missing-nodes #timelog:01:04:58
14:44: Do the nodes ever make it in the main sort loop? #bug-missing-nodes
Using magsign_cpp as an example, which currently has the id of 146.
It doesn't seem to be showing up.
14:52: checking connections #bug-missing-nodes
Connections look right.
15:01: what are the initial "no incoming nodes"? #bug-missing-nodes
This comes next after populating connections. I have a feeling that calling table.remove and table.insert aren't doing the things I expected.
Checking these nodes.
I mean, I guess it looks right?
15:16: How to trace what would need to happen to reach "cpp" node (missing) #bug-missing-nodes
"cpp" would have to be added to the "no incoming" table. Which, I would expect wouldn't be happening.
The nodes_connected_to function: which way is the direction? If "nodes connected to A" returns B, is it "B->A" or "A->B"?
It's checking if the node is on the left, so "A->B", so the parent nodes.
If this is true, header_file_looks_like_boilerplate (id 137) should return the cpp node (id 134). Let's test this.
It's not showing up. Nevermind that was my fault.
Okay the cpp node is indeed showing up as expected.
15:39: check how many times cpp node shows up? #bug-missing-nodes
This is insightful: the cpp node shows up 2/3 times. The missing node here is magsign_cpp. So, cpp might be forever missing if magsign_cpp is missing. That's enough for today.
16:01: Time to make a log TOC #toc-logs #timelog:00:24:54
16:18: adding "jump to top" links.
16:22: functional. But I'm finding title errors now.
Hopefully I just wrote them in wrong.
16:25: It would be great to retrospectively add titles to days #toc-logs #add-day-titles
So, I've added a task for this. add-day-titles.
16:27: okay it all works now! #toc-logs
18:14: Printing and transferring to RM #react-managing-state #timelog:00:14:00
18:28: Off to hub. Some attempts to try reading there.
18:45: React reading: reacting to input with state #react-managing-state #timelog:00:32:21
19:41: React reading: choosing the right state #timelog:00:27:44
20:15: Bang on electric keys in the hub. Loudly sing half-remembered songs.
21:00: go home.
2024-06-28 Day 40. Missing nodes. HTML/WASM export in tic80.
jump to top. 42 days left of batch.
07:28: Get Sokol example working on new laptop #investigate-tic80
Precursor to getting tic-80 working. Oh wow that was easy. Little did I know I was 30 seconds away from getting this working last night.
07:30: Back to tic-80 compilation on new laptop #investigate-tic80
Now that sokol example works, I think I figured out all the graphics issues
07:36: TIC-80 builds, runs tetris, no sound yet #investigate-tic80
07:47: Sound works. Studying music demo. #investigate-tic80
I like the scale and form factor of this.
08:08: trying to see where export is hanging #investigate-tic80
08:12: tried running in gdb and gdb crashed. musl issues? #investigate-tic80
08:44: tic80 export works on mac #investigate-tic80
The bundle includes the tic80 card and a wasm file. I have no idea where the wasm comes from. It's not a file included in the source, and there doesn't seem to be any clear indication where/how it is being generated. My best guess is that somewhere it gets made with emscripten because I see mentions of it when I grep for wasm-y things.
09:39: At the hub.
09:40: Back to investigating missing nodes #bug-missing-nodes #timelog:01:25:41
09:41: Oh crap the ultimate spinner is available
I missed my chance months ago (waited months for this). Can't afford it. Oh well. Still buying.
09:52: =generate_graph_data.lua= and =genpage.lua= #bug-missing-nodes
These are the missing files
09:55: Run =dz_wikigen.sh= to trigger debug message #bug-missing-nodes
09:57: The nodelist is not being created correctly. why? #bug-missing-nodes
10:00: genpage.pagedata: output nodes being made via params.nodelist #bug-missing-nodes
How is this being created?
10:00: Lua issues, debug flag is not working #bug-missing-nodes
10:12: Lua function setting the global variable sets it to true somehow? #bug-missing-nodes
10:13: Oh wait, I was setting this #bug-missing-nodes
10:18: okay weird. the debug flag is being unset somewhere apparently #bug-missing-nodes
Not the problem I wanted to troubleshoot, but hey I need this for the visibility.
10:22: more testing for visibility with traverse_node() #bug-missing-nodes
The debug mode flag is never set to true here. Confirmed with printing and filtering output.
Also confirmed that it doesn't matter if you're using the function dbg I made or checking the variable itself. It's the same. So, I'm pretty sure the dbg shortcut function is fine.
I need to be able to track down when this is being set, and then unset.
10:29: the debug flag is definitely being set exactly once #bug-missing-nodes
I think what's happening is the variable is being copied over when it is loaded up in the module. Hoping to fix this by using a big global that can be read everywhere.
10:35: Okay, global debug flag didn't work. Rethinking approach #bug-missing-nodes
This is NOT what I'm debugging. I just need visibility.
10:39: Crap. I was setting the debug flag too late. Oh well. #bug-missing-nodes
I have a clumsier approach now, but it works, and I've wasted too much time chasing this pre-bug to improve it.
10:42: Missing node names show up in input node list supplied to genpage #bug-missing-nodes
Confirmed that the nodes do come up, but they aren't being traversed for whatever reason.
10:44: Do missing nodes show up in topsort? #bug-missing-nodes
10:48: Topsort isn't returning enough nodes. Why? #bug-missing-nodes
should be 33, but I'm getting 20.
11:17: Let's find out the missing nodes #bug-missing-nodes
I tried looking for something to make sense in the code, but I can't yet see a pattern for what is there and what isn't there.
11:21: Missing nodes seem to line up with the missing nodes on page #bug-missing-nodes
Clocking out for now
13:34: Troubleshooting tic80 export on thinkpad #investigate-tic80 #timelog:00:44:37
13:42: onExportHTML call seems to return, hmmm #investigate-tic80
printf-ing to console works, even if gdb crashes
13:45: =tic_net_get()=: that doesn't sound great #investigate-tic80
Sounds like it might be trying to download something from the internet? Yeah, that could be a problem here.
13:51: =tic_net_get()= might not be getting called #investigate-tic80
Firstly, the printf I placed there isn't showing up. Secondly, it makes a call to emscripten_fetch, which I do not (think I) have.
Maybe it is defined elsewhere?
Ah, there are multiple definitions here. Wonder which one is the actual one.
13:57: =tic_net_get()= is an empty function. #investigate-tic80
The tic cart loaders are pre-compiled wasm files. The path is /export/1.2-dev/html.
14:08: The tic cart loader, as far as I can tell, is closed source. #investigate-tic80
Basically, HTML export works by fetching from a URL. This downloads a zipfile that already has the wasm in there. Exporting is simply a matter of dropping that TIC file in the zip file.
14:11: Options moving forward #investigate-tic80
So, it's looking like doing any kind of hacking on this would be a fairly ambitious task that would take up most of my time. It looks like this wasm file is the only thing I have. What I'd like to do is to somehow patch it, potentially. And get it to load in custom code for audio playback.
It's not a no, though it would dramatically change what it is I came here to do. Instead of doing funny vocal synth stuff, I'd be going deep into trying to reverse engineer this thing. Feels very much in the spirit of RC. And this is more motivating to me than more frontend-y stuff. There may be less polished things to show, though.
14:21: Looking into wasm2wat #investigate-wabt #timelog:00:22:34
Needed for reverse engineering tic80 wasm bundle.
I love how easy it was to build, and the tools look very neat.
14:45: thoughts on reverse engineering tic80 #investigate-tic80
Using this to make any sense of tic80 feels like something that would take me months to a year. Hacking the tic80 source code itself would take less time. I'd probably have better luck building my own cartridge loader.
I think if I wanted to go deep on wasm rather than frontend stacks, I'd definitely want to look into this. This stuff sparks joy more than javascript/React.
14:46: how is a tic cart saved? #investigate-tic80 #timelog:00:22:00
I am pretty sure it's lua5.3, based on grepping for the version of the built-in luajit.
Functions defined in cart.c. Looking at music.tic, the code seems to be at the very bottom.
16:48: The Big Plans #what-now-week6 #timelog:00:26:42
Ink on paper.
17:20: some triage and organizing
17:23: Friday concluding thoughts #what-now-week6 #timelog:00:27:00
I want to focus on designing more musical experiences using my vocal synthesizer, that run in the browser.
This, however, is not enough of a project, and risks being too design oriented rather than technically oriented. I am also quite seriously considering putting some work into hacking my vocal synthesis DSP written in Rust into tic80.
Why tic80? I like the aesthetic. And hacking the software's source code seems to very much be in the spirit of RC. It is also possible to export to WASM, but not with the hacks. As a stretch, it may be possible to somehow hack the wasm TIC cartridge wasm blob, or even write a custom TIC loader based on the TIC-80 codebase.
Design and presentation is still an important factor. There will be two kinds of musical interactions: ensemble and solo. The main ensemble singing project will probably be something similar to Blob Opera. The main solo singing project will be "A Blob Named Bucket", a chittering, singing virtual pet that you have to coax out of hiding with food.
A Blob Named Bucket will be the more complicated project, requiring a procedurally generated critter generator, and a character engine simulating Bucket's behavior. For this reason, these components will be broken up into smaller self-contained demos. These need to be self-contained because it's likely I may not be able to get to the final project.
The ensemble demo will probably just be for the web using p5, and will build off of the chords demo I made in the first half.
TIC-80 demos will do their best to follow some of the ideas in the p5 demos. Could be thought of as "redux" versions? I want to do my best to refine the mechanics of these demos, reduce until they are distilled into simple elegant forms of sonic interaction.
Tooling will need to be made as well. These will probably be HTML/WASM, and for practice, I will use React for these.
17:38: Careers strategy #what-now-week6 #what-now-week7
I did not think too hard about this. That's what next week can be for?
17:40: End of planning in week6 #what-now-week6
2024-06-27 Day 39. NeXT-ing. TIC80 investigations.
jump to top. 44 days left of batch.
Well, 43 days really. August 9th is a Friday.
prev: new thinkpad provisioning. react reading. codestudy mostly seems to work (though I found another weird bug which is probably unrelated).
next: || nexthacking: NeXT hacking today with JB. || investigate-tic80: Some initial tic80 investigations. I want to see how feasible it is to hack the audio engine to play new synthesizers like my singing synthesizer. || what-now-week6: I didn't really get around to this yesterday. || bug-missing-nodes: Bug hunting
08:04: Log started.
08:05: How easy is it to build tic-80? #investigate-tic80 #timelog:00:30:26
Source <<gamedev/tic80/github>>.
Starting on my Mac. Looks like pretty typical CMake stuff.
Seems to work! now off to try Linux.
Going to try the vanilla CMake TIC80 instructions first. I don't expect it'll work and I'll need to install packages.
Holy crap, the CMake script ran without any issues. That's surprising.
Running into Makefile errors. Turning off -j flags.
"RAKE-NOTFOUND: not found": is this a ruby thing? It sounds like a ruby thing. Installed ruby-rake and mruby and then re-ran cmake. The build is moving forward.
TIC-80 seems to include most dependencies in the source directory. This probably makes the thing a bit more portable. Reminds me of how I think.
Wow! It built. It looks like there may be support for JACK via pipewire. I've never actually looked into pipewire, but if there's a way to use it with my existing JACK applications, that would be great.
Oh crap. Alpine just permanently removed jackd when I installed pipewire-jack.
Pipewire-jack is going to be half a day of research. Going to hold off for now. Fortunately, I managed to re-install old jack.
If there is pipewire support for JACK that works with TIC-80, I'd probably want to use that to get sound working on my Linux box. Further reading <<audio/pipewire_docs>>.
08:38: Morning triage.
08:48: Deprioritizing. #voxbox-shape-morphing
It's going to need to get done, but I can't have it in the priority bin like this.
08:49: I think I've done enough for this research #investigate-2d-gameengine
I've just had the idea to look into tic-80 and hack the audio engine. This seems to be most in the spirit of RC, and probably going to be more fun than trying to get a bunch of graphics libraries working.
Sokol would be what I'd want to try as well for something similar to tic-80 graphics-wise, but that's more upfront work.
There's always p5.js and friends for very quick rapid prototyping. I really need to convince myself to try using these for time constraints.
08:52: I really need to get this done soon. #bucket-initial-scoping
Basically, "A Blob Named Bucket" will most likely be a culmination of other mechanics I try out.
As I type this, I can't help but wonder if this will be too time consuming to try and implement.
You know what. I'm going to story-board the gameplay for bucket. That's what is missing.
08:55: Task created #bucket-storyboard #bucket-initial-scoping
A good follow-up to my initial scoping. I'm kind of paralyzed with tech stack choices, and I'm very limited with time, so boiling down game design is a good choice. I know with a certain degree of accuracy how long pen and paper takes.
08:58: Finished reading this chapter yesterday #react-describing-ui
09:00: Next react chapter: managing state #react-managing-state
09:01: poke demo moved back to main #demo-poke
Probably won't get to it until next week. This has been added to my priority group.
09:03: Getting zulip check-in ready
09:28: Some NeXT research #nexthacking #timelog:00:07:08
See: <<retrocomputing/next>>
09:39: Quick time looking at the missing nodes #bug-missing-nodes #timelog:00:17:01
10:00: NeXT-ing
16:00: Presentations
20:30: Home
2024-06-26 Day 38. Thinkpad arrival. Space Jam.
jump to top. 45 days left of batch.
My social battery is pretty drained, to be honest. It was a difficult day/night for me yesterday. Didn't sleep well. It's important to me that I push myself to keep showing up and interacting with people. I'm realizing that "working at the edge of your abilities" doesn't necessarily mean technical abilities.
Prev: codestudy, planning, lots of hub activities.
Next: || provision-thinkpad: my laptop came yesterday! Going to pick it up and provision it today. || htmlize-codestudy-files: still need more time on this. || what-now-week6: commit some brainstorms to tasks.
09:14: Morning Triage
09:25: Prototyping the initial game using p5 is probably a good move #bucket-initial-scoping
Other than the sound engine working, getting the drawing pipeline working feels like it'll be the most time consuming thing. I don't know if I want to spend my time doing that.
09:35: Writing out zulip check-in.
09:47: Trying out mini-wasm-renderer
See <<webdev/wasm/mini_wasm_renderer>>.
It works!
09:54: Attempt to pick up my thinkpad laptop #provision-thinkpad
10:07: Laptop turns on. Good. Now to get Alpine image downloaded and installed #provision-thinkpad
12:04: I have a decent enough system now
12:18: Working on getting nmcli configured
Using the alpine wiki
12:26: networkmanager service stops when I try to list wifi networks
I need to figure out how to troubleshoot this
It's in /var/log/messages I think, tailing now.
I can get more verbose output with service networkmanager start -v
12:35: rebooting
12:37: Okay nmcli can list devices now, attempting to connect
I get a "not authorized to control networking" message, which is a known thing in the Alpine wiki. There is a config file I can make:
[main]
auth-polkit=false
12:42: we have a connection!
12:43: Going to run over to the hub and try to connect this thing to their WiFi
I will feel pretty accomplished by then.
13:30: (approx.) Extended Lunch
15:19: OLED black figured out, back and forth
15:45: Re-acquainting myself with last progress. Final remaining bits with any luck #htmlize-codestudy-files #timelog:00:23:06
16:05: The CSS works, now the links need to be hooked up into the HTMLized output #htmlize-codestudy-files #timelog:00:15:43
16:22: Test it out on website. #htmlize-codestudy-files #timelog:00:18:43
16:32: Why are nodes missing in "potential" codestudy? #bug-missing-nodes
For example nodes like "cpp" and "magsign_cpp" are missing.
It's not being added to the data.
jq '.nodes[] | .name' data/codestudy/potential/index.json | grep magsign_cpp
18:34: React Reading #react-adding-interactivity #timelog:00:18:40
20:00: SPACE JAM.
2024-06-25 Day 37. HTMLize codestudy work, half-baked demos
46 days left of batch.
A table of contents sure would be nice, huh. toc-logs.
Did you know Philip Glass used to be a plumber?
Prev: "A Blob Named Bucket" planning, reflect on RC time so far, C creatures, Reading.
Next: || htmlize-codestudy-files: Did not do yesterday. Try to tackle this before leaving for Hub. || bucket-initial-scoping: Did some brainstorming, now I want to work out some actionable items. || Lots of RC things today: Webdev Meetup, Audio Hang, Graphics Hang, Half Baked Demos
09:29: Morning Triage.
Late start. Ironically, I was trying to wake up earlier, but it backfired.
09:32: I need to do less. #what-now-week6
There is always the temptation to bite off more than one can chew, and once again I feel like this is starting to happen. I'll probably want to leave this up for the rest of the week and see if any more thoughts emerge.
09:36: Make tiny games before this small game #bucket-initial-scoping
I want to build demos exploring some of the core "verbs" I wrote down, like "poke", "eat", and "move (to target)". Probably in p5. Sound design would be a great thing to garnish it with after; it will be another task too.
09:56: Zulip Checkin
10:02: Initial refactoring attempt of my weewiki system #htmlize-codestudy-files #timelog:01:04:09
Basically, there are default header/footer functions for HTML pages. I need to rework this so that the dagzet files use their own custom header/footer functions.
Does weewiki store the current page name somewhere? Yes, it's ww-name.
I'm going with global mutable state, because that's ultimately going to be easier.
Oh boy, this is tricky. By the time the Org gets parsed, the headers/footers are already set up. This means I can't actually disable the templates from the Org file itself. It needs to come from a lower level.
I can add the files to a list, but that's error prone.
Okay, files are going to have the "dzfiles" or perhaps "dzf" prefix for terseness. So for example, a file called codestudy/potential/mag_sign.rs would have a prefix of dzf/codestudy/potential/mag_sign.rs. My weewiki server would then be able to filter those pages and disable templates entirely. So far this is the best hack I've got.
testing out hacky idea
Initial idea working. I'll need to inject the rest of the HTML next.
10:03: Oh wow my laptop is coming today gotta pick that up.
11:00: Prep For Hub
11:30: Webdev meetup
12:30: Half of Audio Hang
13:00: Lunch
Halal cart nearby. Pretty good.
14:00: Graphics
15:00: More demo planning #what-now-week6 #timelog:00:23:46
16:00: Half-baked demos
I showed my "milowda" work and my vocal chords demo. I feel a little self-conscious about showing the vocal chords demo, as it was a little too polished. This work is also very personal to me, so I felt quite anxious after presenting. Did not expect that. I'm usually pretty okay with presentations.
17:30: head home.
2024-06-24 Day 36. Week 6. A Blob Named Bucket.
47 Days Left of Batch.
This is now about at the halfway point. Going to have to start thinking about what happens now.
Gandalf: "All we have to decide is what to do with the time that is given to us."
I spent this last weekend brainstorming an interactive experience involving my vocal synthesis engine. Next thing you know, I'm shopping around for small cross-platform game libraries that can handle pixel art okay. Sokol so far has been the winner <<gamedev/sokol/github>>.
Working title for my little game: A Blob Named Bucket.
Next: || htmlize-codestudy-files: I ran into a bit of a hitch yesterday, hoping to return to this. || bucket-initial-scoping: Set up some tasks for this. || voxbox-shape-morphing: Begin planning how to add shape morphing with linear gesture. || what-now-week6: Halfway point. Stop, Reflect, Plan.
08:15: Morning Triage.
08:22: De-prioritizing. #voxbox-better-size-control
As I think I've mentioned already, this is turning out to be more of a DSP problem, less of a programmer problem. It'd be cool to have at some point, but not during my time at RC.
08:26: De-prioritizing #demo-tongue-control
I think I need more musical demos now.
08:27: Initial investigation complete #investigate-2d-gameengine
Sokol and/or SDL is probably going to be my solution moving forward. The general approach is to use textures and rectangles for sprites. There is documentation on this that I can read.
08:30: It would be great to get this done this week #htmlize-codestudy-files
I think this is a very important aspect.
08:33: This could still work as a self-contained demo #demo-poke
This could be done in a p5 app. I just want a circle that giggles and jitters when you poke at it. It'd be a good precursor to other experiences I want to make, like "A Blob Named Bucket".
08:41: Working on Zulip Checkin
09:11: Done with Zulip Checkin. Walk and think?
10:45: Brainstorm: Bucket Verbs #bucket-initial-scoping #timelog:00:09:38
10:56: Brainstorm: Recurse: what now? #what-now-week6 #timelog:00:12:36
11:30: Lunch
12:00: Nap
12:30: Get ready for Hub
13:00: C Creatures
14:47: Quick pocket for react reading #react-adding-interactivity #timelog:00:06:55
15:00: Some pairing with Dan
16:45: Off Home.
17:30: Dinner
18:53: Done writing logs up to this point.
19:50: Reading #react-adding-interactivity #timelog:00:15:08
20:05: Reading #react-adding-interactivity #timelog:00:20:40
2024-06-23 Day 35. Sokol investigations.
48 Days left of batch.
Thinkpad ordered yesterday. It should be arriving by the 28th, so I'll have about 40 days of use with it while I'm here. I got a quote for the M1, and it was less than I expected, so I might still do that. With the thinkpad, I'll never have any days where I'm without a portable computer. That is the important thing.
I was thinking all day about "A Blob Named Bucket", doing research on game engines and graphics libraries, and trying to get things to work on my Linux box. I think the best contender so far is Sokol. I'm hoping to do most of the game logic in Rust. A hybridized C + Rust approach may work too, if I can get it all compiled down to a wasm file working on the web.
09:20: I turn on my computer and create this log file.
09:27: Found this 2d sprite repo today with sokol/sdl. Looks perfect. #investigate-2d-gameengine
link: https://github.com/seyhajin/webgl2-wasm-sdl-sokol-sprite.
WASM builds and runs okay with emcc. I was hoping it would have had a sprite moving, but it's just projecting a static texture.
I'm starting to feel diminishing returns, as this "simple idea" might turn out to be not as simple as I thought, and might require more time/focus than I have right now.
Fail early. Fail often.
09:55: Back to studying "sapp" #investigate-2d-gameengine #timelog:00:04:52
the "spine" examples seemed to be closest to what I was looking for, but it turns out it's also not what I'm looking for and intended for much larger games that I'd like to build: http://en.esotericsoftware.com/spine-in-depth.
10:00: Trying out a Sokol add-on for 2d graphics someone made #investigate-2d-gameengine #timelog:00:12:07
link: https://github.com/edubart/sokol_gp
The OSX build in this makefile is out of date. Getting errors. I'd like to see this work, so I'm going to try and get this running on my Linux box. If that fails, I'm back to just learning more about sokol_gfx.h and actually reading the documentation.
Okay! The makefile works on my Linux box! Good. Progress.
Making a dagzet graph for sokol.
10:17: Studying sokol_gfx more in depth #investigate-2d-gameengine
I want to get the text rendered to a PDF somehow, so I can read it on my remarkable.
Learned about :hardcopy in vim. :hardcopy > foo.ps writes out a PostScript file, and set popt+=number:y will include line numbers.
There are two kinds of programmers: those who understand why text editors have a "hardcopy" printing feature in 2024, and those who do not.
Uploaded PDF to Remarkable.
10:30: Reading sokol_gfx header comments on remarkable #investigate-2d-gameengine #timelog:00:25:36
13:00: Trying out Gate #investigate-2d-gameengine #timelog:00:34:10
Found this while looking up how to do pixel art on OpenGL: https://github.com/SergiusIW/gate
running into linker problems.
Solved by adding println!("cargo:rustc-link-arg=-L/opt/homebrew/lib") to the build script.
Now it is panicking because it can't find OpenGL/GLES library. Probably something to do with it being a Mac?
Working out a test file: https://stackoverflow.com/questions/26745284/cmake-not-finding-gl-h-on-os-x. This worked fine, though I had to change the include file from GL/gl.h to OpenGL/gl.h.
Digging into gate library a little more, since it is just SDL underneath it.
There's only so much stack tracing and debugging I can do before I call it quits here. This is another nope.
13:45: Another look at sokol gp on OSX #investigate-2d-gameengine #timelog:00:39:00
I think the sokol headers are out of date. The OSX build references things like Cocoa, while the new sokol headers reference Metal.
Nope. Fortunately, there's documentation on the build incantation used to build cube-sapp.c in the sokol-samples repo:
cc sapp/cube-sapp.c -I ../fips-build/sokol-samples/sapp-metal-osx-make-release/sapp/ libs/sokol/sokol.m -o cube-sapp -DSOKOL_METAL -fobjc-arc -I ../sokol -I libs/ -framework Metal -framework Cocoa -framework MetalKit -framework Quartz -framework AudioToolbox
I want to update the Makefile in sokol gp to match this better.
Made a build.sh script. Still getting a lot of weird metal-related junk errors.
Took the clear-sapp example in the sokol-samples and gradually added lines from the rectangle sample in the sokol gp repo. We have a rotating rectangle now!
15:50: Making decisions on file design #htmlize-codestudy-files #timelog:00:15:11
Files are stored in dz_textfiles. Files are given some sort of path, which can be analogous to a dagzet namespace URL. I'd probably opt to replace things like periods with underscores, though.
Take something like codestudy/potential/mag_sign.cpp; that could be a page called codestudy/potential/mag_sign_cpp. Some assumptions could be made: codestudy/potential would already exist, and mag_sign_cpp wouldn't be a pre-existing node.
A janet function could be written to generate the appropriate HTML given a file path. This could then be called using inline Janet, which could then be created procedurally using a script.
16:05: Bottoms-up: create initial render page. #htmlize-codestudy-files
Gosh, there's so much friction getting this to work because everything is so connected. I wanted to just write Janet code, but I needed other things, so I put it inside a placeholder org file and ran the parser, but the CSS parse path is wrong.
16:37: We're going to need to re-work the HTML generator in weewiki.janet #htmlize-codestudy-files #timelog:00:32:04
I need to point to a brand new CSS file. Overwriting the usual one isn't going to work. Stopping for now because this will require fresh thinking.
18:23: Reading #react-adding-interactivity #timelog:00:23:08
19:14: Attempts to import small sokol gp example #investigate-2d-gameengine #timelog:00:28:31
19:41: Imported. Builds on OSX and Linux now. #investigate-2d-gameengine
19:47: Import ink.
2024-06-22 Day 34. Investigating 2d game engines. Thinkpad ordered.
49 Days left of batch.
Saturday. Nothing really planned. This is intentional.
10:02: Initial concept sketches #blob-brainstorm #timelog:00:10:36
10:35: Late morning logging.
10:42: How hard is it to throw some pixels to a framebuffer in webgl? #blob-brainstorm #timelog:00:42:09
let's find out!
14:20: Exploring pixels #investigate-2d-gameengine #timelog:00:11:15
Link: https://github.com/parasyte/pixels.
Space invaders example works on my Mac. Good lord that's a lot of dependencies though (it's okay, it's okay, just grit your teeth and do it).
Web example runs on my mac. Noticing use of wasm-bindgen. Is it time for me to bite the bullet and actually learn this? I wouldn't be able to get it done in a weekend.
14:34: Now time to look at Sokol #investigate-2d-gameengine #timelog:00:08:12
Link: https://github.com/floooh/sokol.
More "comfortable" because it's closer to a tech stack I know already. I've never used it before, so the learning curve may be just as big. But let's see!
Sigh... Emscripten.
Uhhhh... my harddrive just filled up I think.
This is an omen. I am ditching Sokol for this.
14:43: Freeing up space on my harddrive
Damn. Fast internet fills up disk space quickly.
14:55: Now we try Pixels on my Linux box #investigate-2d-gameengine #timelog:00:08:49
This is the ultimate test, because nothing ever wants to work.
15:05: Well that didn't work. Attempting softbuffer on Alpine Linux? #investigate-2d-gameengine #timelog:00:07:59
15:14: Well that's a no go. Sokol on Alpine linux? #investigate-2d-gameengine #timelog:00:08:10
Good ol' glfw backend works. I do not really like the strange build system.
15:24: Another stab at pixels on Alpine linux. #investigate-2d-gameengine
15:27: Trying to run sway. Wish me luck. #investigate-2d-gameengine #timelog:00:58:26
16:23: Got enough of sway working to re-run pixels invaders example. Still, no luck #investigate-2d-gameengine
I'm starting to think a solution might involve having different backends, with as much logic as possible in the Rust layer. This might take too long as well. Sigh. Graphics.
2024-06-21 Day 33. Attempted better tract length control, "Milowda" chords
50 Days Left of Batch
Prev: VoxBoxOSC works!, Tongue Control in VoxBox, React Reading (finished up "Describing the UI chapter").
Next: || htmlize-codestudy-files: Did not get to this yesterday. Hopefully today. || react-adding-interactivity: More reading on React. || voxbox-change-nose-length: shouldn't be too difficult to do. || voxbox-better-size-control: if there's time today, try to look at this. || demo-poke: Start brainstorming the "poke" demo.
08:58: Morning Triage
09:07: Small task, hoping to get to it today. #voxbox-change-nose-length
09:08: Will look at this today if there's time #voxbox-better-size-control
09:10: created. initial brainstorming today? #demo-poke
09:17: Writing up zulip checkin.
09:30: Done with morning logging.
09:48: Pack up for Apple appointment
12:00: Lunch
13:01: The poke should be a tickle. Think tickle-me-elmo, or the Pillsbury Doughboy #demo-poke
13:02: Adding tractlen control to OSC model #voxbox-better-size-control #timelog:00:03:47
This will be a good way to test out tract length while on my Linux box. -- Actually, this won't work. Interpolation doesn't smooth discontinuities; it only allows smooth transitions with floating-point values. I will need to build an example.
13:08: Building tract length example instead #voxbox-better-size-control #timelog:00:06:00
13:15: The glitches are there (deliberately), now to see if they can be smoothed #voxbox-better-size-control #timelog:00:30:00
13:45: This is not working as expected #voxbox-better-size-control
I was hoping this would be a quick matter of linear interpolation between the last two samples at the output, but I'm still getting issues. It's starting to turn into more of a DSP problem than I'd like it to be, which is not why I'm here at this program. So, I'm putting it to rest for now.
The attempts are managed in the "tractlen" branch of my voxbox repo.
14:50: trying one more thing... #voxbox-better-size-control #timelog:00:18:43
Somehow I think the fractional tract length needs to be considered while computing the scattering junctions. But I'm not sure how yet. I added in a best guess, but it's not right. Okay, I really, really need to stop this.
15:21: Reading and notetaking #react-adding-interactivity #timelog:00:19:07
15:43: More reading and notetaking #react-adding-interactivity #timelog:00:18:49
16:40: Attempts to study sourcehut CSS again #htmlize-codestudy-files #timelog:00:15:59
16:46: Ha, okay I managed to make it go. Magic! #htmlize-codestudy-files
16:53: Did some clean-up. I now need to start pre-planning htmlization #htmlize-codestudy-files
16:55: Re-familiarizing myself with the schemas used here. #htmlize-codestudy-files
What does the schema look like? What kind of information does it have? How do file names get translated into things "seen" by dagzet? How to make sure file range links go to the right place? All this and more... later...
16:57: Attempt to quickly change the nose length like tract #voxbox-change-nose-length #timelog:00:23:10
This would be the quick and dirty approach, no interpolation.
17:00: There is a lot of duplicate code between the tract and nose #voxbox-change-nose-length
Including this tract size behavior. I am ignoring that fact right now.
17:06: attempting to integrate with singer sliders demo #voxbox-change-nose-length
17:11: Uh-oh, why is my JS code not working? I didn't do anything!
Oh okay. Just needed to clean some caching and rebuild.
17:22: Things sound right in the example! #voxbox-change-nose-length
The velum now sounds nasal pretty evenly everywhere.
20:15: Attempt to sequence the first few bars of "Milowda" using vocal synthesizer
Found some sheet music. I want to hear how the sonorities sound.
21:49: Got some initial milowda chords. Attached it to the file "milowda_chords.mp3"
2024-06-20 Day 32. Tongue control in VoxBoxOSC, Agnus Dei
51 Days Left of Batch
Cracked screen update: I have an appointment tomorrow morning at the apple store. I was thinking about just getting it in and fixed right there, but now I'm thinking I'll just get a quote and sit on it some more. It's going to be a lot of money. The internet tells me it's somewhere in the ballpark of 600-700 USD. I'm starting to think about just getting a refurbished thinkpad (x390). I'm not locked into the apple ecosystem, and I don't really need a powerful laptop for my needs. Also, my m1 works just fine when plugged into an external monitor.
Prev: More VoxBoxOSC Work, playing around with Sourcehut's CSS to figure out how they do line highlighting, reading on React, published my vocal chords demo! https://pbat.ch/recurse/demos/vocal_chords.
Next: || voxbox-tongue-interface: Implement this because it would be nice to have for voxboxOSC. || voxboxOSC: add the rest of the planned parameters (tongue control and gain). || react-describing-ui: Pretty sure I can finish up reading this chapter today. || htmlize-codestudy-files: Get the extracted CSS to go from almost working to working.
08:28: Morning Triage.
08:40: Zulip Check-in.
08:57: Done with morning logging I think.
09:24 Off to get batteries for my scanner wand
10:39: React reading #react-describing-ui #timelog:00:32:22
11:30: Lunch
12:46: Implement gain, add smoothing parameter abstraction #voxboxOSC #timelog:00:21:47
13:08: Time to look at this tongue interface again. #voxbox-tongue-interface #timelog:01:17:47
13:16: I think I might keep the diams vector in for now #voxbox-tongue-interface #create-diams-interface
It's very helpful for tongue control.
13:27: Going to add smoothing to the tongue #voxbox-tongue-interface
This proved to be more challenging than I expected, because I was flipping between instantaneous and filtered values, and I wanted to make sure that doing things like setting the smooth amount wouldn't cause any jumps, which led to some subtle logic. I think I got it mostly right. Well, good enough.
14:58: Adding more stuff to read to the priority bin
15:00: Exporting these to PDF #react-adding-interactivity
15:35: Reading #react-adding-interactivity #timelog:00:22:29
16:02: Presentations
20:02: Worked out some of the parts in "Agnus Dei" by Artemisia Trio
In an uncharacteristic move, I went back to the hub at night to work out some of the harmonies in this YouTube video I found <<misc/artemisia_agnus_dei>>.
2024-06-19 Day 31. Happy Juneteenth!
52 Days Left of Batch.
I'm feeling a little bit of pain in my right arm, so I am going to be limiting my computer time today. More reading today, I think.
Prev: Initial code for OSC controlled voxbox, more chords on vocal chords demo. Did not get to HTMLizing codestudy stuff, tongue control in voxbox, or any reading.
Next: Finish up initial VoxBoxOSC MVP, React reading, make the chord demo look a bit nicer (maybe?)
08:30: Morning Triage
08:50: Work on Zulip Checkin
09:06: Okay. Going for a walk. Maybe shopping.
11:20: Oops that was a longer walk than expected.
I accidentally wandered onto a movie set in the historic district of Brooklyn and stayed and watched. They gave me a water bottle, so I'm part of the Movie Industry now.
11:34: Shopping for food
13:18: Get OSC messages to actually control stuff #voxboxOSC
In rust, this might be slightly difficult with how I am currently arranging things because I have my OSC listener component separated from the DSP component. Ideally, I'd like to have the DSP be a reference inside the OSC component, but that may be tricky.
14:35: We are leaking memory #voxboxOSC
Because the main loop is in C, I'm leaking memory when I allocate the DSP object.
14:40: Wait I figured out how to free it unsafely! #voxboxOSC
It involves calling unsafe { Box::from_raw() } and then drop.
14:53: okay pitch control works from OSC. Taking a break from this #voxboxOSC #timelog:01:08:19
15:34: How does sourcehut handle source code highlighting #htmlize-codestudy-files #timelog:00:51:30
Initial extraction and studies in scratch/lines_css/.
16:49: Let's see if I can improve the CSS of the buttons #chords-demo #timelog:00:46:54
20:04: Some react reading. #react-describing-ui #timelog:00:41:05
2024-06-18 Day 30. VoxBox over OSC, more Chords.
53 Days Left of Batch.
Prev: Initial chords demo code works in browser, needs elaboration. Also, C Creatures. I didn't get to any of the other things planned. But, I did have some nice conversations. I guess I'm getting an OSC interface working to control VoxBox? Looked into a few Rust and C options. Prototyping something out today.
Next: || voxboxOSC: work out an initial OSC-controlled prototype of voxbox. || chords-demo: iterate. Functionally it should all work for the most part. Now I need to build it out and add more chord buttons. || htmlize-codestudy-files: try to find time to begin work on htmlizing code study files. || voxbox-tongue-interface: this would be good to have for the OSC controller.
08:03: Morning triage
08:19: Current gameplan #voxboxOSC
The rust library <<rust/crates/cpal>> didn't work on my alpine linux box, but <<rust/crates/rosc>> was fine. So, the plan is to build something up using libsoundio, then have the inner loop be implemented in C.
08:24: renaming "rust/libs" to "rust/crates"
08:25: Getting ready to write zulip checkin
08:43: Zulip checkin posted
08:57: setting up some boilerplate #voxboxOSC #timelog:00:19:32
Now we have some C code making sound, and some exported rust code setting the frequency of a sine tone.
09:17: splitting up C code a bit #voxboxOSC #timelog:00:16:34
This code is from the libsoundio sine example. I need to make sure this is organized well enough that I can start putting custom DSP code from Rust, and eventually add OSC polling.
Okay, nevermind on the splitting. I have a better sense of this control flow. I want to make sure I can get my Rust data into C okay.
While I didn't do any refactoring, I did manage to figure out how to send userdata to the SoundIO struct.
09:34: Initial attempts to get a voice working #voxboxOSC #timelog:00:39:50
10:16: Initial attempt at receiving OSC messages #voxboxOSC #timelog:00:17:13
10:35: Now we try to get it to shut up over OSC #voxboxOSC #timelog:00:18:24
11:01: Why are pitches sounding different on OSX and Linux? #voxboxOSC
11:03: I explicitly hard-coded a request for 44.1kHz on OSX and that seemed to fix it #voxboxOSC
11:30: webdev/typescript
12:30: Last bits of audio hang
13:00: Lunch
14:45: Expand on chords. #chords-demo #timelog:00:25:56
15:30: Graphics Meetup
16:30: WASM Users Group
17:30: Debugging/Pairing with Carsten
19:30: Home
2024-06-17 Day 29. Start of Week 5?
54 Days Left of Batch.
I am thinking about next week being the week I send my laptop in for repair. I do have my other computer, a NUC running Alpine Linux, but it's a very specialized computer. I won't be able to do much web stuff, but most Rust stuff will be fine.
Prev: More work on chord demo, studied VCV Rack Potential code using annotation <<codestudy/potential>>, set up code annotation tooling, React reading.
Next: || vcv-potential-study: wrap-up initial review, follow up on TODOs made yesterday (dztodo). || chords-demo: Build out initial interactive demo based on example. || voxbox-tongue-interface: re-work the tongue control interface so it can work with arbitrary sizes. || htmlize-codestudy-files: begin work on htmlizing code study files.
07:28: Some morning reading #react-describing-ui #timelog:00:21:08
07:55: Morning triage
08:22: How to make this more actionable? #demo-react-UI
When I initially wrote this, I didn't really know what it was I was talking about.
The goal here is to be able to use a NameBrand JS Framework to get better at understanding how this tool works. I think when I initially wrote this, I was imagining React to be more of a widget library, when really it's more about managing state of the widgets, and the widgets are somewhere else.
Perhaps what I really want are some nice looking sliders with CSS, and then using React to manage the state of these?
08:38: Writing Zulip Check-in message
09:05: Time to schedule an Apple appointment for this screen. Sigh. Goodbye, money.
09:19: Well, I forgot the password to my apple ID again.
I don't have any other apple device, so this is a painful experience. Do I even bother?
09:59: Did some Zoom experiments on my Nuc
Surprisingly, it seems to work on Alpine for now. It's nice to know that I can use that as an option in a pinch.
10:01: Getting ready to go to hub.
10:29: Prep-work for rusty-bikes after chat with Binh #study-rusty-bikes #timelog:00:17:00
10:47: I forgot my cable. Tempted to head back home and pick it up.
11:09: And we're back.
11:10: Setting up boilerplate code #chords-demo #timelog:00:41:00
11:51: Talk with Dan
12:44: Back to scheduling a genius appointment for my cracked screen
12:48: More boilerplate code setup? #chords-demo #timelog:00:10:00
13:00: C Creatures
14:00: Lunch
15:32: Initial sounds work, now more sounds. #chords-demo #timelog:01:07:57
16:46: OSC protocol with Paolo
/voc/pitch, /voc/velum, /voc/tongue/x, /voc/tongue/y, /voc/volume
2024-06-16 Day 28. Chord progressions in vocal ensemble, studying VCV Potential code
55 days left of Batch.
Prev: Added adjustable vocal tract size to singer demo (crude, but it works), establish internal time logging protocol for my logging system, created a linear gesture demo, some sound design work on "vocal chords" demo.
Next: || react-describing-ui: read more from the "describing the UI" chapter from React website. || vcv-potential-study: study the VCV potential plugin code.
14:08: starting this log
14:12: Follow-up: add interpolation for smoother size transitions #voxbox-size-control #voxbox-better-size-control
14:15: This would be a good opportunity to use my code annotation tool #vcv-potential-study
14:17: Pulling in codegen utilities #vcv-potential-study
14:20: Pulling in code files to study #vcv-potential-study
It doesn't have to be all the files, just enough to understand how the plugin architecture is organized. Figuring that out now.
14:28: I have once again installed NERDTree to help me out #vcv-potential-study
I stripped my editor of all features a while ago, now they are slowly creeping back in.
14:46: Gotta make file ranges visible in HTML output #vcv-potential-study
Eventually, I'd like to generate interactive pages of code, but that can wait.
15:09: Okay let's get studying #vcv-potential-study #timelog:01:01:44
16:24: Trying to make dztodo filterable
16:27: break
16:43: Attempt to make a chord progression using gesture #chords-demo #timelog:01:45:00
2024-06-15 Day 27. Coarse tract size control, Linear Gesture Demo, Better Sounds in Chords demo
56 days left of batch.
Prev: read up on React, finished implementing the linear gesture path algorithm, got the Potential VCV plugin to compile and run, messed around with midinous with Dan and Scott.
Next: || voxbox-size-control: experiment with realtime tract size control. || react-describing-ui: read more from the "describing the UI" chapter from the React website. || voxbox-linear-gesture-demo: create a linear gesture webaudio demo from the "Tom's Diner" example I made yesterday. || create-time-log-format: start building out workable code.
10:00: Morning triage. #timelog:00:27:00
10:14: Chords demo could benefit from having interactive tract size. #chords-demo #voxbox-size-control
The issue with the chords at the moment is that the voice parts don't sound right, which I think is due to the shapes being wrong. If I am able to tune the size along with the shape, I think I'll end up with better sounding results.
10:20: Thinking about reworking the timelog format here. #create-time-log-format
Right now the syntax is #timelog mm:ss, but if we include the time in the tag, like #timelog:mm:ss, SQLite would be able to parse it.
10:23: uh-oh, segfault from yesterday's logs
10:27: Okay it works again
I forgot to close out an inline Janet expression.
11:03: Set up some initial boilerplate files for gesture demo #voxbox-linear-gesture-demo #timelog:01:28:37
11:42: oh boy, lifetimes #voxbox-linear-gesture-demo
11:51: maybe I should be using vectors here? #voxbox-linear-gesture-demo
12:03: Trying to rework things so that the linear gesture doesn't need the path upfront #voxbox-linear-gesture-demo
12:33: Okay taking a break from this. Still running into ownership problems #voxbox-linear-gesture-demo
13:45: Did some (re)-reading on the lifetimes chapter, and other research #voxbox-linear-gesture-demo #timelog:00:10:30
13:57: Things seem to build. Now, will it play? #voxbox-linear-gesture-demo #timelog:00:10:29
We have sound! More fiddling with voxbox gesture path required.
14:08: Tempo control #voxbox-linear-gesture-demo #timelog:00:20:38
14:30: Some initial write-ups #voxbox-linear-gesture-demo #timelog:00:19:01
14:51: Uploading to website. #timelog:00:03:33
16:02: Experimenting with timelog querying in SQLite #create-time-log-format #timelog:00:50:31
16:55: Commandline tool "timelog" made. Good enough for my needs #create-time-log-format
18:29: Going to implement crude tract size control #voxbox-size-control #timelog:00:52:41
Units will be in CM, and there won't be any interpolation.
I can't believe it worked haha.
18:52: Okay time to make this stuff sound better in the chords demo #timelog:00:42:48
Things are starting to sound pretty.
2024-06-14 Day 26. Compile Potential VCV Rack plugins, More Gesture Path
57 days left of batch.
Prev: Thinking In React react-thinking-in-react, Some initial work on a potential chords demo chords-demo, some initial work on implementing a basic Gesture Path Algorithm in Rust implement-gesture-path, infrastructure in my task management system create-dagzet-todo-page.
Next: More React stuff react-first-component, Continuing building out the initial gesture path implement-gesture-path, try to compile Potential VCV compile-potential-vcv, tract size control if there's time? voxbox-size-control.
09:00: Morning Triage #morning-triage #timelog:00:14:20
09:06: I have follow-ups to this now #create-dagzet-todo-page
see: test-dz-log-task-linking. Calling this one done now.
09:14: Will return back to this. #morning-triage
09:25: Okay Back I think. #morning-triage #timelog:00:29:42
09:35: Working on Zulip Check-in.
09:45: Tract-size control would be helpful for finding shapes #voxbox-size-control
I think the secret to getting a better ensemble sound is going to be tuning vowel shapes for each particular voice type (soprano, alto, tenor, bass). Each of these need slightly different tract sizes. It would be very helpful if there were a way to dynamically adjust the size. No idea if this would work.
10:07: Starting react: your first component. #react-first-component #timelog:00:41:03
10:10: You know what, I'm printing and sending this to my remarkable #react-first-component
I don't want the eye strain reading on this OLED screen.
10:18: Printed all the sections in this chapter, including this one. #react-first-component #react-describing-ui
I didn't realize how the documentation was organized. So, I've created a new task called react-describing-ui.
10:19: Making new chapter tasks, a bit of a tangent #react-first-component
10:25: Getting back to reading, now on RM.
10:52: Okay done with this section. #react-first-component #react-describing-ui
I'll just be tacking reading progress for this section onto react-describing-ui from this point forward.
11:02: Lunch
12:00: Play some guitar
I've been trying to pick out "Fire and Rain" on acoustic guitar by ear.
12:36: Setting up example file scaffolding #implement-gesture-path #timelog:00:14:18
12:54: Trying to work out how path arguments will work #implement-gesture-path #timelog:00:13:59
I think the path argument might just be a reference to an array of gesture vertices, which seems to mean lifetimes. No, no, this will be good for me.
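The "reference to an array of gesture vertices" idea could be sketched like this; all names here are hypothetical stand-ins, not the actual voxbox types:

```rust
// A gesture that borrows its path. The lifetime annotation 'a is exactly
// what the borrowed slice forces on us: the gesture may not outlive the path.
#[derive(Clone, Copy)]
struct Vertex {
    value: f32, // e.g. pitch
    dur: f32,   // duration in beats or seconds
}

struct LinearGesture<'a> {
    path: &'a [Vertex],
    pos: usize,
}

impl<'a> LinearGesture<'a> {
    fn new(path: &'a [Vertex]) -> Self {
        LinearGesture { path, pos: 0 }
    }

    // Walk the borrowed path one vertex at a time.
    fn next_vertex(&mut self) -> Option<Vertex> {
        let v = self.path.get(self.pos).copied();
        self.pos += 1;
        v
    }
}

fn main() {
    let path = [
        Vertex { value: 60.0, dur: 0.5 },
        Vertex { value: 62.0, dur: 0.5 },
    ];
    let mut g = LinearGesture::new(&path);
    while let Some(v) = g.next_vertex() {
        println!("{} for {}s", v.value, v.dur);
    }
}
```

Holding a slice rather than an owned Vec is what drags lifetimes in; switching to an owned vector (as considered at 11:51) sidesteps the annotation at the cost of a copy.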
12:58: Refreshing myself on lifetimes in the Rust book #implement-gesture-path
In theory, I think I have something that might work. But now I have to get back to interfaces.
13:08: Trying to get the traits working to fit my mental model #implement-gesture-path #timelog:00:41:46
13:51: Let's see what works and what breaks in the example #implement-gesture-path #timelog:00:59:04
Amazingly, it seems to mostly work.
14:40: There's a bit of a logic bug. #implement-gesture-path
Right now the example timings are off.
14:53: Get the example to sing tom's diner #implement-gesture-path #timelog:00:30:20
I realized I had to make sure behavior logic was working as expected (A_val -> (A_dur, A_bhvr) -> B_val)
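That A-to-B notation can be read as: vertex A's value moves toward vertex B's value over A's duration, shaped by A's behavior. A sketch under that reading (the names and the Behavior enum are my assumptions, not the voxbox API):

```rust
// alpha is elapsed time divided by A's duration, in [0, 1].
enum Behavior {
    Linear,
    Step,
}

fn interpolate(a_val: f32, b_val: f32, bhvr: Behavior, alpha: f32) -> f32 {
    match bhvr {
        // Glide from A toward B.
        Behavior::Linear => a_val + (b_val - a_val) * alpha,
        // Hold A until the duration elapses.
        Behavior::Step => a_val,
    }
}

fn main() {
    // Halfway through a linear segment from 60 to 64:
    println!("{}", interpolate(60.0, 64.0, Behavior::Linear, 0.5)); // 62
}
```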
15:24: It's working! Adding example to this repo #implement-gesture-path
16:03: Off to Hub
16:15: Jam With Dan and Scott, Midinous
20:09: Trying to compile potential vcv now #compile-potential-vcv
Got a make error.
Reading through Rack dev tutorial https://vcvrack.com/manual/Building
Okay this is going to take a while. Sigh. Will get back to this.
20:30: Back at it #compile-potential-vcv #timelog:00:17:22
20:45: It builds, and it seems to open up in VCV Rack #compile-potential-vcv
I put the potential repo in the plugins folder of the Rack directory. Also, compiled Rack from source.
21:00: Okay I've seen enough. I'll have to see what I can do to study this. #compile-potential-vcv
2024-06-13 Day 25. Gesture Path initial work, initial work on chords demo
58 Days Left of Batch.
Prev: Tic-Tac-Toe in React, WASM Users Group, Audio Hang, Rephasor Implementation, Hackers! Also read the intro/preface of "Elements of Computing Systems".
Next: Gesture Path Algorithm implement-gesture-path, "Thinking In React" reading react-thinking-in-react, create vocal ensemble chords demo, chords-demo.
07:19: Morning Triage #morning-triage #timelog:00:19:25
07:27: initial thoughts on chords demo #morning-triage #chords-demo
I'm thinking about building a simple web demo involving 4 voices, reverb, and some buttons that allow you to switch between chords. Probably just 4 or 5 buttons with chord states. It would be cool to be able to switch out chord sounds too.
07:34: Getting ready to write Zulip Check-in
07:46: Final wrap-up for morning triage
07:56: What are we in for? #react-thinking-in-react
Looks like a small overview page, with some follow-up links.
07:57: Follow-up link tasks created #react-thinking-in-react #react-installation #react-describing-ui
(Other tasks linked here).
07:59: Start reading. #react-thinking-in-react #timelog:00:30:12
08:36: off to whole foods
I need to get chip clips (for holding my remarkable in place on easel) and maybe chocolate.
09:30: Back. What to do now?
09:35: planning things for this #create-dagzet-todo-page #timelog:00:12:20
Thinking about abusing my tagging system in my logs to link TODO tasks with dagzet items.
09:39: Added dztodo utility. #create-dagzet-todo-page
It just worked out of the box which is nice.
09:41: just thought up the "timelog" hashtag convention today #create-time-log-format
09:42: attempt to link task to dznode #react-first-component #dz-webdev-react-your-first-component
The "dz" is the prefix for a node in the dagzet. Following that is the path, with slashes "/" replaced with dashes "-".
As a proof of concept, I'll need to write a program that can use this log entry to link the task it belongs to the dagzet node being referenced here.
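The naming convention above is mechanical enough to sketch in a few lines; the function name is made up for illustration:

```rust
// A dagzet path becomes a log tag by prefixing "dz-" and
// replacing path slashes with dashes.
fn dz_tag(path: &str) -> String {
    format!("dz-{}", path.replace('/', "-"))
}

fn main() {
    println!("{}", dz_tag("webdev/react/your-first-component"));
    // dz-webdev-react-your-first-component
}
```

Note that the mapping is lossy: once slashes become dashes, a dash in a node name is indistinguishable from a path separator, so the linking program would have to resolve tags against the known set of dagzet nodes.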
09:44: Testing out linking #create-dagzet-todo-page #test-dz-log-task-linking
I've also created a proof of concept
09:46: The test page will be react first component, linked here #test-dz-log-task-linking #react-first-component
Will follow up on this... at some point.
09:58: Create Voice abstraction #chords-demo #timelog:00:33:14
This will make it easier to spin up voice demos.
Moving Nose to its own file.
Creating Voice file with struct
Adding Voice example. Does this make the unused warning go away? Oh, nevermind. I wasn't making the struct public.
10:34: Break
10:47: Building up an initial chord #chords-demo #timelog:00:20:37
Wow, it sounds bad.
11:20: Lunch
12:20: Better ah sound #chords-demo #timelog:00:59:04
13:36: Tuning to Voce8 Eric Whitacre chord at 21s #chords-demo
https://youtu.be/aynHSTsYcUo?t=21
14:21: Gesture Path Initial Scaffolding #implement-gesture-path #timelog:00:55:03
14:37: Moving over to linux box.
15:39: Let's see if I can get this design right for choosing the next vertex #implement-gesture-path #timelog:00:15:17
16:02: Got an initial trait that compiles. Not sure this is the design I want though. #implement-gesture-path
2024-06-12 Day 24. Implemented rephasor. Hack the Planet!
59 days left of batch.
Prev: Impossible day! I worked with JB to attempt to implement some audio-reactive effects in Swift. There was no time to do the FPGA stuff I had in mind.
Next: Finish up react tutorial react-tic-tac-toe, start reading the follow-up react-thinking-in-react, start working on implementing the rephasor in Rust.
08:00: Morning triage
08:21: Researching VCVRack + Rust plugins #voxbox-vcv
See: <<rust/potential_vcv>>.
08:29: Uh-oh, janet bug in my day 23 logs
Okay fixed. Glad that was an easy one.
08:31: Ready to publish, writing up Zulip check-in
08:57: Adding my react code up to this point to git #react-tic-tac-toe
09:28: React extension works now! #react-tic-tac-toe
I probably needed to restart the browser.
09:57: Got slightly distracted on Zulip finding things related to career
See: <<career/system_design_primer>> and <<career/interactive_coding_challenges>>.
10:12: Completed tutorial #react-tic-tac-toe
There are some follow-ups that I'm probably not going to do.
10:35: Getting ready to leave for the hub
11:00: Chat with Isaac
12:10: Lunch
12:45: Trader Joe's Shopping
12:58: Nap
13:30: Pre-rephasor work #implement-rephasor
13:33: Working on rephasor on my linux box yay #implement-rephasor
13:34: Going to try out a new thing with how to log times
Going to start a stop watch, and then somehow use the tag system here to log a duration with the event as the start time.
13:35: Beginning initial work #implement-rephasor #timelog:01:00:00
13:37: Boilerplate code from phasor #implement-rephasor
13:38: Copy over rephasor struct from sndkit #implement-rephasor
13:53: Building out initial example #implement-rephasor
14:37: Initial implementation with example done #implement-rephasor
14:40: Back to the Hub.
15:00: WASM Users Group (WUG)
16:00: Audio Hang
17:00: Dinner
18:00: Return To Hub
I found a hard copy of the Elements of Computing Systems and read the preface and intro
18:30: Hackers Movie Night!
HACK THE PLANET!
2024-06-11 Day 23. Impossible Day.
60 days until EOB (end of batch).
Yesterday I was very tired and out of it. I did a lot of walking over the weekend.
prev: Unblock and reflect workshop, lunch with people (Next Stop Vegan Festivals), some very baby steps into a react application react-tic-tac-toe, implemented a simple way to do preset import for singing synthesizer demo quick-preset-export.
next: impossible day! I'm currently planning on breaking my focus up into two parts. Some of my time will be spent working with Jeff B drive some of his video filter with audio. The other part of my time will be spent doing research into FPGA gaming. I'll be studying <<FPGA/openfpga_pong>> and <<FPGA/another_world_5k>>
08:13: Morning Triage
08:32: Getting a tag set up for impossible day #impossible-day
08:34: Composing Zulip Check-in
08:49: Let's see if I can get pong working on my Analogue Pocket #impossible-day
09:16: Some initial talks with JB, Pong installed and running on AP #impossible-day
09:17: Getting ready to go to Hub. #impossible-day
10:00: Examining some code. #impossible-day
11:00: Kick-off for impossible day #impossible-day
11:30: Impossible day work #impossible-day
13:00: Lunch (approx.) #impossible-day
A group of us ate at NAYA.
14:00: Impossible day work (approx.) #impossible-day
At this point, I was very out of my depth. Jeff was very patient with me. Thank you, Jeff.
16:30: Regroup and presentations #impossible-day
We were struggling to get Zoom to work with the iOS emulator. But our thing did technically work. Sort of felt like I made a fool of myself in front of everyone. Oops.
John Cage was once asked how he felt about his audience laughing at some of his (admittedly odd) work, to which he replied "I prefer laughter to tears". No tears, only laughter. I suppose I was entertaining. So, I'll take the win.
17:00: A Quiet and Fast and Hopefully Polite Exit Home #impossible-day
I think I am what you'd call an Introvert's Introvert. Being around people at RC is quite enjoyable, but it has been draining for me. I'm really starting to feel it. I'll need to see what I can do to find some quiet and recharge later this week.
Suddenly being around so many people after years of being around so few is an ongoing adjustment for me. Still, it is important to me that I try to push myself a bit and coax myself out of my apartment into uncomfortable situations. The people here are friendly.
20:49: Some logging.
20:50: Attempts at more of this react tutorial #react-tic-tac-toe
21:06: Trying to get the react developer tools extension for firefox working #react-tic-tac-toe
The extension doesn't work on the current iteration of react ("This page doesn't appear to be using React"), but it does work on the React website.
Moving on with the tutorial without it.
21:24: Left off: about to code things so that X's can be written in every box #react-tic-tac-toe
2024-06-10 Day 22. Preset import. React tic-tac-toe.
61 days until the end of my batch (August 10th)
Prev: Velum/Nasal debugging. Some demos involving overtone throat singing. Some initial preset work.
Next: Preset Import in demo, read up on React, scope out initial Gesture Synthesizer Generator components in Rust.
08:15: Morning Triage
08:27: Oscillator may be less important at this point... #implement-osc
Originally, the idea was to use this as a proof of concept for the WAV writer. Things have moved forward without needing this.
LFOs are still a thing that would be nice to have though, so I'll keep that in mind.
08:29: rephasor sometime this week? #implement-rephasor
Thinking about building a simple gesture signal generator in Rust this week.
08:30: gesture path would be nice this week #implement-gesture-path
08:33: should start my React-ing #react-tic-tac-toe
09:10: Just finished my zulip check-in
09:15: boot up linux box, get internet working
09:58: import ink from previous weeks.
10:14: ink added to git repo and pushed to github.
10:30: Get ready to go to RC.
11:05: Unblock and reflect workshop
12:00: Lunch
14:02: planning what to work on
14:14: scope out preset importing functionality #quick-preset-export
The thing that makes this difficult is that I want the UI elements to update with the internal state. I'd need to build something out to do this.
What's a quick way to get this done?
14:20: setting up some boilerplate, which is as far as I can see right now #quick-preset-export
14:32: Looks like I can simulate events. #quick-preset-export
14:52: Okay, grokking this better. #quick-preset-export
I need to set the slider from JS, then fire off an input event to set the rest of the parameters.
15:03: Glottal controls work, now regions #quick-preset-export
A little bit more challenging, since those sliders are made an older way. Hopefully I can refactor this so all the sliders are made the same way.
15:32: Okay it works #quick-preset-export
I suddenly ran into some weird issues with the textarea =textContent= value, so I changed it to be =value= instead.
15:34: so textContent is not a textarea property but a DOM node property? #quick-preset-export
See: <<webdev/textcontent_mdn>>. I am not sure how it was working all this time then suddenly not working.
15:43: Uploaded to website #quick-preset-export
15:52: Tic Tac Toe Tutorial #react-tic-tac-toe
15:55: damn, distracted by trying to get a draw game #react-tic-tac-toe
It took a few tries, ended up looking up a draw.
15:57: Okay back on track #react-tic-tac-toe
16:27: Packing up early. getting very sleepy #react-tic-tac-toe
2024-06-09 Day 21. Even more NaN-hunting. Initial preset export. Synthesized overtone throat singing.
10:32: Compare nose reflections with tract implementation. #implement-velum
10:41: Left Reflection coefficient seems too high #implement-velum
In the rust version, I'm getting 2.0. In the original, it's -0.1.
10:43: wait wait wait, I need to look at right reflection not left #implement-velum
Similar deal actually. The original is still a small negative value.
10:46: Changed velum to 0.0 in reference, similar results #implement-velum
I have my velum set to be 0 in the rust version. So there is definitely something wrong with these left/right reflection coefficient calculations.
10:52: Sum value is very different #implement-velum
It is larger in the reference (~7.3) vs my implementation (1.062).
I had my nose starting position calculated using a relative percentage of the tract length. Maybe this is an unstable way to do this?
10:59: I think I found it. #implement-velum
There were typos in my reflection calculations. Additions that were supposed to be subtractions. Looking up the wrong index.
11:01: We have actual sound! #implement-velum
11:02: Velum control seems to work! #implement-velum
11:03: Reworking demo #implement-velum
I lost the ideal tube shape sound. Going to need to fiddle around with the interactive example.
11:24: Getting velum slider working: Unknown opcode (195) #implement-velum
11:32: commented out panics and debug prints #implement-velum
I think this was getting in the way of wasm compilation.
11:48: turning off vibrato #implement-velum
12:50: Some initial throat singing shapes made #implement-velum
lots and lots of trial and error
13:33: Smooth transitions between overtones #implement-velum
At this point, I'm just having fun. I think the velum is correct haha
14:06: Working on a melodic sequence #implement-velum
14:15: Drown it in reverb #implement-velum
14:42: Throat singing demo works, created mp3 for it. #implement-velum
Now, time for cleanup.
14:57: well, it's a good thing I made an mp3... #implement-velum
My shapes for the throat singing example are ruined once again because there was some code I accidentally commented out.
15:46: Another attempt #implement-velum
16:09: probably good enough #implement-velum
16:13: restoring web example, now with velum #implement-velum
20:27: Coding up a quick exporter for singer demo #quick-preset-export
Textarea with a button that says export/import. Export fills the textarea with the current vocal tract state. Import loads the stuff from the textarea.
20:40: Let's see if I can have one function update the pitch and UI related to it #quick-preset-export
Probably not going to be an ideal design pattern. I can already see why an architecture like react would be helpful.
20:47: Nevermind, let's just try and dump data? #quick-preset-export
21:15: Uploaded something with just "export", import still a WIP. #quick-preset-export
Good enough for tonight.
2024-06-08 Day 20. More NaN-hunting.
Prev: Tooling workshop, nose/velum/nasal debugging. Next: NaN-hunting my nose code.
11:29: The NaN hunt resumes, where was I? #implement-velum
11:34: Earlier NaN found at position 1365 #implement-velum
There is something wrong with the omega (w) coefficient being computed inside of =compute_scattering_junctions=. Peeling that back some more now.
=w_r[i - 1]= is fine, and doesn't NaN before =w=.
=w_l[i]= is also fine, and doesn't NaN out before =w=. =r[i]= is the only variable left in there. Let's see if it NaNs.
=r[i]= is also fine. huh?
11:38: none of the components for =w= have NaNs #implement-velum
I'm guessing it must be an overflow NaN. Some big numbers somewhere. Going to use =dbg!= to inspect the variables =w_r[i - 1]=, =w_l[i]=, and =r[i]=.
11:42: =w_r[i - 1]= and =w_l[i]= are both big numbers #implement-velum
Check it out:
[src/tract.rs:183:17] w_r[i - 1] = 2.5360622e38
[src/tract.rs:183:17] w_l[i] = 1.0904602e38
[src/tract.rs:183:17] r[i] = 0.0
So, I gotta figure out why these numbers are getting so large. In my experience, it's usually due to numbers being so small.
I could change the precision to =f64= instead of =f32=, but this moves the goal posts, and might not actually fix the underlying problem.
11:46: Where would =w_r= and =w_l= be getting such large numbers? #implement-velum
11:57: Multiplying an inf by zero gives you a NaN #implement-velum
Found an interesting thing here. I wanted to break down the components giving me NaN. So I wrote some debugging code that looked like this:
let add1 = w_r[i - 1] + w_l[i];
let add2 = r[i] * add1;
if add2.is_nan() {
dbg!(r[i], add1);
panic!("NAN");
}
The results of this are here:
[src/tract.rs:190:17] r[i] = 0.0
[src/tract.rs:190:17] add1 = inf
thread 'main' panicked at src/tract.rs:191:17:
NAN
12:12: Note to self: =is_finite()= is how to check for infinity #implement-velum
If it's false, it's inf.
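Both observations (inf times zero yields NaN, and =is_finite()= rejects both inf and NaN) are easy to confirm in isolation:

```rust
fn main() {
    let add1 = f32::INFINITY; // stands in for w_r[i - 1] + w_l[i]
    let r = 0.0_f32;          // stands in for r[i]
    let add2 = r * add1;

    assert!(add2.is_nan());     // IEEE 754: 0.0 * inf is NaN
    assert!(!add1.is_finite()); // inf is not finite
    assert!(!add2.is_finite()); // NaN is not finite either
    println!("0.0 * inf = {}", add2); // prints NaN
}
```

This is why checking only =is_nan()= upstream missed the problem: the inputs were inf, not NaN, and the NaN only materialized at the multiply.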
21:12: Some (hopefully) quick NaN-hunting to try and move this a bit forward #implement-velum
21:14: Find when =w_r[i - 1]= and =w_l[i]= become inf earlier in the code #implement-velum
It's going to be in the nose tick function somewhere I think.
Specifically, i is at position 11.
21:31: Large numbers start happening early on #implement-velum
Specifically at =tr_jr[nose_start]=.
21:41: Tomorrow's strategy: comparing the numbers with the C implementation #implement-velum
It seems like the reflection coefficients might be causing things to blow up.
2024-06-07 Day 19. Initial velum implementation. The NaN Hunt begins.
Prev: game jam. some velum stuff.
Next: logging presentation for tool-time, velum.
07:25: hand-draw site ecosystem, brainstorm presentation format #logging-presentation
08:25: get dot graphs working #logging-presentation
Transfer inked up outline to graphviz dot format. See: scratch/howworks.
Useful guide on writing graphs with dot: <<misc/dotguide>>.
09:00: get initial presentation wiki page going. #logging-presentation
see: how_logging_works
09:06: Logging overview page #logging-overview
09:23: Logging Pipeline #logging-overview
Gotta get images working now...
09:40: Site Ecosystem #logging-overview
09:48: Logging format #logging-overview
10:09: make sure things upload okay #logging-overview
images are new, just want to make sure that code works okay.
10:18: Practice presentation, make tweaks #logging-overview
10:42: Catch up on yesterday's logs
13:06: nose tick function porting: right junction equation looks wrong #implement-velum
Above it is the computation for the left junction. I'd imagine a symmetry between them, but it is not symmetrical. The original voc source code looks like this as well. I think I may have to go back to Pink trombone and check on it. At some point. In the meantime, I've dropped a TODO.
13:32: Hooking nose with tract in tick function #implement-velum
13:49: Try to get an initial example up and running. #implement-velum
13:52: Uh-oh. Overflow with the existing example. #implement-velum
It's the damn LCG again in the glottis.
14:01: tract example sounds chipmunky now #implement-velum
going off track a little bit
14:20: Got a nice throat-singing patch going. No nose yet. #implement-velum
But it sounds cool. Might want to import my reverb for added effect.
14:26: Okay it blew up. Fizzled into high frequency. #implement-velum
But the good news is that it did build.
14:27: This is a good enough stopping point #implement-velum
I gotta get ready for a presentation.
15:00: Tool-time workshop!
Wiki page I presented from: how_logging_works.
15:38: Debugging what went wrong #implement-velum
Line by line debugging, comparing to reference C file. Yick.
16:18: Pretty sure the nose.tick() function is the culprit #implement-velum
looking at it again. I have tried to clean up the code to make it more readable.
16:27: Nasal is a NaN! #implement-velum
Must be dividing by zero somewhere.
16:32: NaN checking: no nans in nose reflections #implement-velum
16:34: NaN checking: NaN introduced during scattering junctions #implement-velum
=ns_r= and =ns_l= both have it (which makes sense). Not sure which one gets it first. Going to check above.
16:42: Finding the earliest NaN #implement-velum
Listing these as I go.
1367: =tr_jr[nose_start]=.
1367: =tr_l[nose_start]= is NaN before =tr_jr[nose_start]=.
16:53: Taking a break from NaNs #implement-velum
NaN-hunting fries my brain.
16:59: NaN-hunting: A little more digging #implement-velum
So far, I believe =tr_l[nose_start]= to be the earliest introduction of a NaN, at sample position 1367.
17:07: NaN introduced into tract left 1 sample earlier, after nasal #implement-velum
=tr_l[nose_start]= has a NaN at 1366 at the end of the computation loop, after the nasal component is made. It happens right after =update_waveguide=.
17:09: Following update_waveguide #implement-velum
=tr_l= is copying over =tr_jl= with a multiply by 0.999. Where is the last place junction left is being updated?
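As I understand the step being described, it amounts to something like this (a sketch under that assumption, not the actual voxbox source):

```rust
// The left-going delay line copies the left junction values with a slight
// damping factor, which normally keeps the waveguide from blowing up.
fn update_waveguide(tr_l: &mut [f32], tr_jl: &[f32]) {
    for (l, jl) in tr_l.iter_mut().zip(tr_jl) {
        *l = jl * 0.999;
    }
}

fn main() {
    let mut tr_l = [0.0_f32; 3];
    let tr_jl = [1.0_f32, -1.0, 0.5];
    update_waveguide(&mut tr_l, &tr_jl);
    println!("{:?}", tr_l);
}
```

Note that a 0.999 damping multiply can never introduce a NaN on its own; it only propagates one, which is why the hunt has to continue upstream into the junction computation.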
17:11: Following last update of junction left #implement-velum
17:17: I have lost the trail. Will start again tomorrow #implement-velum
17:20: Final NaN thoughts #implement-velum
Junction Left at the nose start is the earliest instance of a NaN I have found in my code. At 1366, it is introduced before the nose tick, but after compute scattering junctions.
2024-06-06 Day 18. Post Game-Jam.
Prev: game jam.
Next: game jam wrap-up. back to implementing velum. thoughts on preset export.
09:34: Morning triage.
09:38: Basically completed with game-jam related things #game-jam
I think I've wrapped up most of my contributions to the game jam. I don't think I can afford to do much more.
10:06: Finished triage, about to make a check-in.
10:27: Where did I leave off with velum #implement-velum
10:34: Nap
11:05: Try to add some scaffolding #implement-velum
11:09: Actually realizing that "nose" is a better name #implement-velum
"Velum" refers to the membrane separating the oral cavity (mouth) from the nasal cavity (nose). The nose is another waveguide. The velum is just a value that controls the opening of this waveguide.
11:29: Some initial scaffolding and comments. #implement-velum
I think I understand enough to start coding some things up.
11:30: Getting ready to go to Hub
12:00: Pairing and level tweaks, game testing #game-jam
13:00: Lunch
14:02: Back at home, coding up some nose/velum stuff. #implement-velum
14:34: Back to Hub
15:00: LOL nvm the rain is too heavy
I was out there for 2 minutes, got to half a block, and got completely soaked.
15:11: Back to working on this at home. #implement-velum
15:21: I forgot about reflections #implement-velum
Things start to get trickier here because I'm trying to separate tract and nose code, but this needs stuff from tract.
15:28: Attempt to go back to hub.
16:00: Presentations.
18:00: Back home.
2024-06-05 Day 17: Game Jam?
Game Jam.
09:00: More Jamming #game-jam
22:30: End of Jamming #game-jam
2024-06-04 Day 16. Game Jam Day.
Probably not going to have too much logging here. Today is going to mostly be about getting the mechanics and level designer interface working. Tomorrow will hopefully be more about refinement and design.
07:55: Scoping out gamejam stuff #game-jam
I inked up some of my thoughts, set up priorities, and attempted to focus on what the game components will be.
09:00: nap
09:32: morning triage
09:32: let's revisit the code #rework-rectangles #game-jam
09:47: it would be nice to get rid of tubePos #rework-rectangles #game-jam
09:51: it's going to take too much time to recode this properly I think #rework-rectangles #game-jam
I guess I'll have to live with it?
09:53: hmm, maybe I can start by trying to introduce global x offset #rework-rectangles #game-jam
10:00: trying to make tubes immutable data #rework-rectangles #game-jam
Instead of updating the xoffset every time, create a new rectangle with global X offset applied.
10:09: attempt to establish more low-level rectangle format in JSON #rework-rectangles #game-jam
I don't necessarily want double rectangles every time any more.
I have the schema, but more refactoring is needed. I want to load the raw rectangle and skip the tube positions (which then copy over to the rectangles).
10:17: might need to eliminate tubePos at this point? #rework-rectangles #game-jam
The JSON file is loading rectangles directly, I don't think tubePos is required anymore?
10:57: JSON rectangle format works! now to generate in lua #rework-rectangles #game-jam
11:00: Game Jam #game-jam
12:00: Audio Hang
13:00: Quick Meet #game-jam
13:30: Lunch
14:00: Game Jam #game-jam
15:20: Brainstorm: what kind of interaction should blocks have? #game-jam
19:11: Brainstorm: some level design #game-jam
2024-06-03 Day 15. (Ooops I forgot it was game jam day)
This weekend, I moved into the apartment that I will be staying in for the remainder of my batch. Will probably have earlier hours now.
prev: Made my first interactive singing synthesis demo for the web! See https://pbat.ch/recurse/demos/singer_test.
next: velum, presets for interactive demo, tasks page, C Creatures, more project scoping for voxbox.
08:00: Morning Triage.
08:18: creating some voxbox tasks
08:24: tasks directory work #create-tasks-directory
08:47: adding task descriptions #html-task-descriptions
08:59: Write-up for zulip checkin.
09:20: Getting ready to leave for RC
09:54: velum initial setup #implement-velum
Velum is not implemented in tubular <<voxbox/tubular>>, but it is implemented in tract <<voxbox/tract>>. Getting tract code tangled and re-examining what needs to be added.
10:12: core issue: how to set size of nose procedurally #implement-velum
The original nose waveguide had a fixed size of 28, which should be a size in CM (which will most likely be expressed as a proportional percentage of the tract width in practice).
10:18: figure out current size of nose in CM #implement-velum
The hardcoded size is 28, which I believe is 2x oversampled, so that's 14 samples.
Looks like it is around 10-12cm:
To derive, I plugged the equation I was using to convert tract length (cm) to samples into bc:
(len * 0.01)/(speed_of_sound / sr)
or
(16 * 0.01)/(343.0 / 44100.0)
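The same conversion can be written out in both directions; the constants below are the ones from the log (speed of sound 343 m/s, sample rate 44.1 kHz), restated by me rather than taken from voxbox:

```rust
const SPEED_OF_SOUND: f32 = 343.0; // m/s
const SR: f32 = 44100.0;           // samples/s

// Length in cm -> length in samples.
fn cm_to_samples(len_cm: f32) -> f32 {
    (len_cm * 0.01) / (SPEED_OF_SOUND / SR)
}

// Length in samples -> length in cm (the inverse).
fn samples_to_cm(samples: f32) -> f32 {
    samples * (SPEED_OF_SOUND / SR) * 100.0
}

fn main() {
    // The 16 cm tract from the bc session: about 20.57 samples.
    println!("{:.2}", cm_to_samples(16.0));
    // 28 hardcoded samples at 2x oversampling -> 14 samples,
    // which lands in the 10-12 cm ballpark mentioned above.
    println!("{:.2}", samples_to_cm(14.0));
}
```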
10:33: some boilerplate
10:45: (approx) gamejam planning with JZ
12:00: gamejam kickoff (put velum on hold) #game-jam #implement-velum
I forgot this week was the game jam. My velum work will have to wait a few days.
2024-06-02 Day 14. Initial singing web demo
15:20: continue setting up DRM sliders #initial-singing-web
16:00: Add glottal controls #initial-singing-web
16:35: Is aspiration actually working? #initial-singing-web
It is very faint.
Ah. Lag was at 0.7 instead of 0.07.
16:58: I cannot seem to set the aspiration value from JS #initial-singing-web
17:11: Okay, why is the pitch slider still working even after I disable it? #initial-singing-web
Ah, chrome was still caching stuff.
17:20: Adding noise floor #initial-singing-web
17:28: Add glottal shape #initial-singing-web
17:39: let's add some smoothing to pitch #initial-singing-web
18:00: Improve start/stop experience #initial-singing-web
18:07: Moving to upload the demo to the website #initial-singing-web
18:19: it works! #initial-singing-web
21:17: Potential safari mobile fix attempt #initial-singing-web
My friend says it doesn't work on iphone safari. I found a potential fix at <<webdev/audioworklet_safari>>.
21:46: friend tells me it ended up working anyways? #initial-singing-web
I'm reverting changes back to what I did before
2024-06-01 Day 13. Moving day.
09:04: Tract example tweaks. #implement-tract
09:33: Some write-ups for social media on tract
Here is what I wrote:
I have ported my tract filter to rust now [0]. I couldn't resist garnishing my "simple" example a little bit [1] to make it more musical and sing-y.
In addition to the bare minimum tract filter processing the glottal source, I've also added some controls over vibrato, amplitude, and tract shape (for vowel morphing). I tuned the tract shapes by ear using the distinct region model.
0: https://github.com/PaulBatchelor/voxbox/blob/main/src/tract.rs
1: https://github.com/PaulBatchelor/voxbox/blob/main/examples/tract_simple.rs
10:28: Some looks at react quickstart #react-quickstart
10:45: Follow-up: tic-tac-toe tutorial #react-quickstart #react-tic-tac-toe
16:45: Get some singing stuff working in the browser #initial-singing-web
18:00: Dinner
18:57: Is a hslider an HTML element? #initial-singing-web
19:20: How to set up pitch? #initial-singing-web
19:36: Set up DRM sliders #initial-singing-web
2024-05-31 Day 12. Picked up keys. Initial working vocal tract.
09:09: Morning triaging
09:30: Packing workstation up for new place (approximate time)
10:30: Leave for Recurse (approximate time)
11:30: Arrive at Recurse (approximate time)
12:30: Lunch with SW
15:00: Pick up keys to apartment
15:45: back to hub. attempt to do work?
14:15: head home early. forgot cables.
17:27: return to tract #implement-tract
19:15: working demo! #implement-tract
2024-05-30 Day 11. Glottis algorithm in Rust, thoughts about Tract
More heads down today. Probably not going into the hub.
Prev: C Creatures, Chats, WASM Group, More Rust Audio Pairing
Next: Implement Glottis algorithm, and hopefully some tract stuff?
09:00: Morning Triage
09:25: Didn't complete this yesterday. Completing today. #implement-glot
09:26: Thinking about tackling some of this port today #implement-tract
I want to push myself to get some singing sounds in Rust by the end of the week.
09:32: Building glot initializer #implement-glot
Struct initialization feels like a pain point to me.
09:55: HP/LP filter uses similar struct, how to do inheritance like things? #implement-glot
Using this as a learning opportunity.
Okay, looks like I want to implement a filter as a trait. See: <<rust/books/rust_by_example/traits>> and <<rust/books/rust_book/traits>>.
10:12: Trying to get traits to work is premature optimization and too much time, making a new task for it. #implement-glot #refactor-glotfilt-traits
I want to set aside some time to actually do this right. It's not quite fitting in my head at the moment, and I need to keep moving forward. Ultimately, what we want is a nice way to define two types, highpass and lowpass, which use identical data structs but slightly different methods.
I imagine it'll be pretty trivial to return and refactor. So far, this has been my experience with Rust.
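For future reference, the trait refactor might look roughly like this: one shared state struct, with two thin wrapper types implementing a common trait. All the names here are hypothetical sketches, not the actual voxbox code:

```rust
// Hypothetical sketch of the planned refactor: LowPass and HighPass share
// one state struct and differ only in their tick() methods.
trait Filter {
    fn tick(&mut self, input: f32) -> f32;
}

struct OnePoleState {
    coef: f32,
    prev: f32,
}

impl OnePoleState {
    fn new(coef: f32) -> Self {
        OnePoleState { coef, prev: 0.0 }
    }
}

struct LowPass(OnePoleState);
struct HighPass(OnePoleState);

impl Filter for LowPass {
    fn tick(&mut self, input: f32) -> f32 {
        // one-pole smoothing: y += c * (x - y)
        self.0.prev += self.0.coef * (input - self.0.prev);
        self.0.prev
    }
}

impl Filter for HighPass {
    fn tick(&mut self, input: f32) -> f32 {
        // highpass as the residual: input minus the lowpassed signal
        self.0.prev += self.0.coef * (input - self.0.prev);
        input - self.0.prev
    }
}
```

The nice part of this shape is that code generic over `Filter` doesn't care which variant it gets.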
10:16: Back to Butterworth porting #implement-glot
These are needed to filter the noise source in the glottal model.
10:18: Oh wait, I implemented the butterworth filter in boing^3 #implement-glot
I'll just copy that over. duh.
10:23: Ugh, how do I use modules locally? need a refresher. #implement-glot
Revisiting <<rust/organizing_code_project_structure>>.
Here is the winning code block for imports.
use crate::butterworth::{ButterworthLowPass, ButterworthHighPass};
Also needed to make sure that lib.rs had butterworth imported too.
10:36: Back to filling out rest of the glot struct #implement-glot
I will need to return to implement the highpass filter.
10:45: implement setup_waveform #implement-glot
11:02: implement hanning table #implement-glot
11:06: implement highpass #implement-glot
11:09: initializer done. now some of the smaller methods #implement-glot
11:15: Port tick function. #implement-glot
11:30: I think there's enough in place for some initial sound #implement-glot
11:37: Uh oh, it panicked. #implement-glot
There was an attempt to multiply with overflow. It seems to be a problem with the RNG.
disabling noise source for now.
11:40: No sound. Debugging in my future. #implement-glot
11:55: troubleshooting: why isn't there any sound? #implement-glot
Here is the debug script:
rm -f glot_simple.wav
cargo run --example glot_simple
mnolth wavdraw glot_simple.wav glot_simple.pbm
ah. I didn't finish setup_waveform()
12:12: Lunch
13:11: Line by line checking with reference #implement-glot
I will log as I go.
setup_waveform: found multiply that should have been add. otherwise, it looks okay.
The table setup looks good. Rest of the initializer looks good. tick(): I messed up most of the exp function translations. Rust has a different notation for this: instead of exp(x), Rust does x.exp().
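As a minimal illustration of the notation difference (the function name here is made up for the example, not from the port):

```rust
// In C this would be written exp(-x); in Rust the method call goes on the
// value itself. The same method-call pattern applies to sqrt, sin, powf, etc.
fn glottal_decay(x: f32) -> f32 {
    (-x).exp()
}
```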
13:26: It seems to work! Now back to the LCG overflow issue. #implement-glot
Looking at checked multiplies, <<rust/checked_mul>>, but that doesn't seem quite right. I want to do wraparound.
It seems to be okay with this solution:
fn rand(&mut self) -> u32 {
    self.rng = (self.rng.wrapping_mul(1103515245) + 12345) % LCG_MAX;
    return self.rng;
}
wrapping_mul was what I needed to keep the panic away. See <<rust/wrapping_mul>>.
14:00: final thoughts #implement-glot
I have finished porting the glottal source component of my singing synthesizer from C to Rust. It was a bit of a tedious effort, but luckily the setbacks were small.
The port can be found here: https://github.com/PaulBatchelor/voxbox/blob/main/src/glot.rs.
Sample code writing to WAV file: https://github.com/PaulBatchelor/voxbox/blob/main/examples/glot_simple.rs.
This implements a glottal source model based on the one described by Hui-Ling Lu in their dissertation. It includes a version of the LF glottal flow derivative waveform model (based off an implementation by Neil Thapen of Pink Trombone), as well as a synchronous pulsed noise component for breathiness and aspiration.
On the first pass manually translating C to Rust, I ran into some issues. Most of the issues ended up being related to the way Rust handles math functions. Instead of doing something like exp(x) like you'd expect, Rust tends to use notation like x.exp(), and I messed up the notation.
By itself, the glottal source component sounds pretty unremarkable. It will need a tract filter before it starts to sound talky.
15:09: publish glot algorithm to website.
15:27: tract scaffolding #implement-tract
15:38: Using tubular. Already realizing some porting issues. #implement-tract
Tubular <<voxbox/tubular>> is the port I'll be basing this off of, and already I can see I did a lot of pointer stuff to manage buffers. I'll probably need to peel things back a little bit and remember what I was doing before going forward.
15:56: Refamiliarize myself with Tubular. #implement-tract
16:00: Actually got unstuck, getting initial struct initialized #implement-tract
16:20: Initial struct made? setting up other bits in init #implement-tract
16:29: tick scaffolding #implement-tract
16:37: generate_reflection_coefficients #implement-tract
16:46: compute_areas_from_diams #implement-tract
16:49: compute_scattering_junctions #implement-tract
17:00: closing up for the day
2024-05-29 Day 10. Glottis Scaffolding, some realtime audio in Rust
Prev: Audio Hang, Software Foundations, Graphics Hang, Rust
Next: Port Glottis Algorithm to Rust, C Creatures, WASM Group
09:00: Morning Curation
09:43: Some initial work on glottis scaffolding #implement-glot
10:00: Pack and leave for RC
11:00: C Creatures
11:44: Paul writes a log for a demo in C creatures
12:00: Whiteboarding with JZ
13:00: Chat with JM
13:30: Lunch (Attempt 1)
I attempted to quickly find food before my next appointment. There was not enough time, and I ultimately gave up. Fortunately, there was a bagel in the kitchen.
14:00: Chat with EB
14:30: Lunch (Attempt 2)
Ate in the food court.
15:15: WebAssembly Group
16:00: Rust pairing with Dan
looked into mutexes. <<rust/mutexes>>
17:48: audio code works! using tinyaudio now
see @(dzref "rust/tinyaudio")!@
2024-05-28 Day 9. Rust code organization.
Prev: Added my task management system to my static site generator (taskgroups). Ported the phasor algorithm to Rust: https://github.com/PaulBatchelor/BoingBoingBoing/blob/main/src/phasor.rs. Paired with Dan, learning how to use cpal <<rust/cpal>>.
Next: Audio Hang, Software Foundations (Lurk), Graphics, migrate DSP code to voxbox, begin work on glottis algorithm port.
08:38: Thoughts during breakfast.
I started with the general question "what is it that I'm trying to do here?" and worked from there. It ended up being a good way to articulate why I chose to do my singing synthesizer project.
I'm trying to find a job. I'm using my time here to diversify my skills to something less niche so I have a better time finding relevant jobs. My career up to this point has involved a lot of "legacy" systems that nobody is hiring for (unless it's a super senior position).
The project I have chosen here involves experiments in singing synthesis that run in the browser. This to me has a good balance of familiarity (comfortable) and unfamiliarity (uncomfortable). Audio programming, singing synthesis, and DSP algorithms are comfortable things for me. The regions of the uncomfortable and unfamiliar lie in how I am doing these things: namely, Rust (to WebAssembly), and just about anything related to web development.
I think my starting point is learning Rust. Even though it is quite complicated, it is the shortest distance away from my zone of familiarity, since this is where I'll be doing the audio DSP work. My first few weeks here will be spent building the core DSP kernel.
09:13: Morning triage.
09:55: This is in a good enough place. Follow-up tasks have been created. #htmlize-tasks
09:56: Resume building can happen next week I think #resume-setup
09:57: Hoping to start the glottis port today. #implement-glot
This is the algorithm I'd like to use: https://git.sr.ht/~pbatch/mnodes/tree/master/item/glot/glot.c.
09:58: It's trivial to add an mtof function. Doing this today. #implement-mtof
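The conversion itself is the standard equal-temperament formula, with A4 (MIDI note 69) at 440 Hz. A minimal sketch of the idea, not necessarily the exact boing^3 utility:

```rust
// MIDI note number to frequency in Hz, assuming 12-tone equal temperament
// with A4 = MIDI 69 = 440 Hz. Accepts f32 so fractional notes (detuning,
// glissandi) work too.
fn mtof(note: f32) -> f32 {
    440.0 * ((note - 69.0) / 12.0).exp2()
}
```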
09:59: Would be nice to try and get a vs code setup today or tomorrow #vscode-rust-setup
10:00: Pack up and leave for RC
10:59: Arrival / Puttering on Couch
11:17: Piano / Wander
12:00: Audio Hang
13:00: Lunch
14:06: Software Foundations
15:11: add re-useable mtof utility to boing^3 #implement-mtof
15:16: glot pre-setup: import wavwrite #implement-glot
This is hard for me because I don't actually know what the best practice is for organizing library code.
Found this: https://rust-classes.com/chapter_4_3
15:30: Graphics Hang
16:30: Home
20:00: Rust code organization
Basically, wanted to see how Rust projects tend to split up components in a project (rather than one lib.rs file).
Glanced at some docs for this <<rust/organizing_code_project_structure>>, and <<rust/bringing_paths_into_scope_use>>.
There doesn't seem to be any good way to have shared code in examples that isn't included in the library. Something like monowav <<voxbox/monowav>> isn't something that should really be publicly available functionality, but it's really useful for rapid prototyping. It's also too small to make it a whole other project. For now, I'm just leaving it in there.
2024-05-27 Day 8. Memorial Day.
Prev: Setting up HTML generation on recurse wiki and dagzet.
Next: Setting up tasks. Phasor DSP code in boing^3. Start transferring some code to new VoxBox project.
07:47: Tea and Morning Triage
07:53: create generate_all script
08:19: Is this even actionable? #plan-reading-schedule
08:34: back to task parsing, where was I here? #task-sqlite-gen
08:38: Right, time for thinking about SQL schemas #task-sqlite-gen
08:48: Also trying to make the text files they are in a kind of group #task-sqlite-gen
09:06: get insert statements for tasks working #task-sqlite-gen
09:10: hook up to rest of files #task-sqlite-gen
09:15: Again with the string escaping in SQLite #task-sqlite-gen
09:19: Quick walk outside
09:48: Breakfast
10:19: Begin work on tag parser #add-tag-parser
Going to need this if I want to connect logs to specific tasks.
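A tag parser along these lines could be sketched as follows; this is an illustrative guess at the approach (split on whitespace, peel off `#`-prefixed tokens), not the actual parser:

```rust
// Hypothetical tag extraction sketch: separate #tag tokens from the rest
// of a log title, returning the cleaned title and the collected tags.
fn extract_tags(line: &str) -> (String, Vec<String>) {
    let mut tags = Vec::new();
    let mut words = Vec::new();
    for word in line.split_whitespace() {
        if let Some(tag) = word.strip_prefix('#') {
            // token starts with '#': treat it as a tag
            tags.push(tag.to_string());
        } else {
            words.push(word);
        }
    }
    (words.join(" "), tags)
}
```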
10:39: Tag extraction works, now SQLite statements #add-tag-parser
Will need a new table for this. I also will want to reference tasks specifically, so I need to make sure I can get the right rowid from the logs row.
10:44: Mostly safe to assume a day/time combo is unique #add-tag-parser
I don't have a formal constraint for this, but this is a reasonably safe assumption. I'll use this to look up the rowid.
10:45: Actually, would max(rowid) work? #add-tag-parser
The tag parsing is always done immediately after inserting the log entry to the table. I assume it auto-increments and is the largest? This would be much easier.
10:56: Let's see what works? #add-tag-parser
It works!
select tag, logs.title from
logtags inner join logs on logs.rowid = logtags.logid;
11:01: I wonder if org can be parsed in my logs? #codeblocks-logs
This is some bold text.
11:07: Code blocks don't work as expected yet. #codeblocks-logs
This is because, by default, lines get their line breaks stripped.
11:10: Going to try using a new divider to preserve line breaks. #codeblocks-logs
11:39: Code blocks work. That took more time than expected. #codeblocks-logs
My org parser is very strict about whitespace, and I had to be a little more involved about how to extract blocks here. It works though.
11:44: Attempt to aggregate logs of a certain tag #connect-logs-to-tasks
If the tag belongs to a task, then this is a comment history.
This seems to work:
SELECT day, time, title, comment FROM logs
INNER JOIN logtags ON
logs.rowid = logtags.logid
WHERE logtags.tag IS "add-tag-parser"
ORDER BY logs.rowid ASC;
11:53: Shoot, comments aren't breaking up right
12:02: comments break up okay now I think
The - - - (no spaces) is something I'm using as a divider. I reworked the logic so that it's included in the SQLite data, and then handled when the HTML is rendered.
Previously, it was part of a pre-processor that actually got filtered out before the SQLite table.
12:06: Creating "hist" tool #connect-logs-to-tasks
This will allow me to view a timeline of logs given a task name id.
12:27: All core ingredients acquired for HTMLized task views #connect-logs-to-tasks
12:30: Lunch
13:12: Begin generating initial HTML pages for tasks #htmlize-tasks
This will be similar to what I did for dagzet. Create one page for each task, and give it a prefix like tasks/htmlize-tasks, which then displays a timeline of logs.
13:49: setting up initial janet code #htmlize-tasks
14:13: now to generate task groups pages #htmlize-tasks
I was thinking this could be like taskgroups/priority, with taskgroups displaying a directory of all the groups.
14:15: Setting up initial taskgroup pages #htmlize-tasks
14:23: Janet taskgroup renderer #htmlize-tasks
14:38: Generating grouptasks directory page #htmlize-tasks
This will just be at taskgroups.
14:42: janet code for taskgroups #htmlize-tasks
14:50: taskgroups work. basic functionality complete. now what? #htmlize-tasks
What's left here is to add descriptions to tasks and grouptasks, and also to add ordering to the task group lists. But hey, this is good enough for now.
14:54: update site generator script to include new subdirs
15:21: Getting prepped for phasor-work #implement-phasor
15:35: Implement phasor example #implement-phasor
15:45: It works! #implement-phasor
I have a slow phasor modulating the frequency of an audio rate phasor, which is getting filtered by an LPF.
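A phasor is just a 0-to-1 ramp that increments at the given frequency and wraps. A minimal sketch of the idea; the linked BoingBoingBoing implementation may differ in details:

```rust
// Minimal phasor sketch: outputs a ramp in [0, 1) at `freq` Hz.
struct Phasor {
    phase: f32,
    freq: f32,
    onedsr: f32, // 1.0 / sample_rate
}

impl Phasor {
    fn new(sr: u32, freq: f32) -> Self {
        Phasor {
            phase: 0.0,
            freq,
            onedsr: 1.0 / sr as f32,
        }
    }

    fn tick(&mut self) -> f32 {
        let out = self.phase;
        // advance by freq/sr per sample, wrapping back into [0, 1)
        self.phase += self.freq * self.onedsr;
        if self.phase >= 1.0 {
            self.phase -= 1.0;
        }
        out
    }
}
```

Used as an LFO (like the slow phasor above), its output just gets scaled and offset into whatever parameter range is needed.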
15:50: Logs and example file: #implement-phasor
Here is the phasor implementation: https://github.com/PaulBatchelor/BoingBoingBoing/blob/main/src/phasor.rs.
Here is the example: https://github.com/PaulBatchelor/BoingBoingBoing/blob/main/examples/phasor.rs
2024-05-26 Day 7. Recurse Wiki is born. Initial log generator plans
Sunday. I'm starting to think maybe I should just build a Recurse weewiki? In many ways it's the path of least resistance.
08:57: Initial weewiki setup #htmlize-knowledge-tree
09:07: Create HTML export script #htmlize-knowledge-tree
I don't actually have a boilerplate script for this.
09:24: extract dagzet tools from brain wiki #htmlize-knowledge-tree
Now that there's a weewiki here, things should be more straightforward.
09:50: Things work #htmlize-knowledge-tree
09:57: Generate sqlite code from logs #htmlize-logs
I'm not going to worry about the tag system here, that can happen later. I just want to be able to add the timestamps and messages.
10:16: Create log db generator script #htmlize-logs
10:30: Trying to get line breaks to work in comments #htmlize-logs
This would be one line. This would continue the same line because there isn't an empty line.
This would be another line.
It looks like the parser I wrote ignores the extra line breaks. Rather than try to update the parser, I'm going to say that a line "" will insert a line break, which makes sense.
10:41: Create an initial logs page #htmlize-logs
11:33: Ran into debugging hole related to checking for valid key #htmlize-logs
I should have been using an "or" operation instead of "and". Funny how I keep confusing the two.
This whole rabbit hole started because what I was calling "comment" in the time events, I called "blurb" in the dayblurb. I was accidentally trying to get a comment from a dayblurb instead of a blurb. It was returning nil, and it was still considering it valid. I had to update the conditional logic.
11:36: Back to getting a single day printed using weewiki #htmlize-logs
11:38: Now to render all days on one page programmatically
Get all available dates, and then render each one by day.
11:48: Okay why is day 3 stopping midway #htmlize-logs
I'm hoping it's not a weewiki thing because that would be tedious.
11:51: Seeing if static export works #htmlize-logs
11:52: What is going on with day 3? #htmlize-logs
11:54: Okay, I think it's string escaping issue in janet #htmlize-logs
12:00: I have no great way to escape the inline janet syntax. #htmlize-logs
In weewiki, I use @!(foo "bar")!@ to do inline janet stuff, but in Day 3 I was using that in another context. The workaround is to put it in a code block.
12:03: Is static export broken for dagzet?
12:06: Ah, it seems I wrote a one-liner to generate all directories needed
12:08: Export script updated. Things work now.
12:09: Try to upload to website?
12:13: uh-oh.
I shouldn't be nesting git repos like this.
12:18: upload script works, now to add jump links for each day #htmlize-logs
Adding for each entry would be cool too!
12:37: Now to add timestamp links #htmlize-logs
12:46: Note to self: do not render raw HTML in an org block #htmlize-logs
12:50: Things seem to work now #htmlize-logs
12:53: dagzet HTML isn't correct on mobile
13:03: Looks like I accidentally removed the viewport width HTML metadata
15:37: Planning and wiki-ing
16:02: initial setup stuff for logs to sqlite data #task-sqlite-gen
Just getting the boilerplate code and parser hooked up.
16:27: I think I have most of the data being parsed #task-sqlite-gen
2024-05-25 Day 6. Saturday. Not too much work.
14:50: Add ink from last week to repo
15:05: Set up ad-hoc workspace
This table I'm currently on is just slightly too high.
15:36: Update dagzet database
Sidetracked: took a picture of my new setup and posted it to Mastodon.
15:44: I need a better way to know which files aren't in the database yet
Refactoring the script now...
16:00: Done for now
2024-05-24 Day 5 planning VoxBox, monowav in Rust
08:50: Wrap up logging from yesterday
09:00: Write checkin
Prev: more voxbox scaffolding, wrote a simple WAV writer in C, presentations.
Next: start writing WAV writer in Rust, plan out DSP layout in voxbox and make tasks for them, Building Your Volitional Muscles.
09:27: Ink out voxbox potential DSP layout :voxbox-dsp-tasks
09:54: Refill SS fountain pen
It kind of exploded. Hoping it will be okay.
10:16: Getting ready to depart for hub
11:10: Arrival
11:15: VoxBox DSP planning (ink) #voxbox-dsp-tasks
12:04: Volitional Muscles Workshop
13:00: SW Neural Encoders
14:30: At workstation
Got into talk with someone
14:44: monowav pre-setup #implement-monowav
just some scaffolding
14:52: adding links to knowledge graph
14:55: some more boilerplate placeholder
15:00: ???
Got pulled into various conversations.
16:44: back to computer
Doing some messaging on zulip.
16:57: Update events in my log
17:03: pairing with Dan
18:32: it works
2024-05-23 Day 4. mono WAV writer in C
09:00: Morning Logging
Raining today, might go into the hub later.
09:48: VoxBox project setup thoughts #rust-proj-setup
Yesterday I pushed initial code: https://github.com/paulBatchelor/voxbox. Used cargo to generate a library, with examples, similar to my boing^3 library.
09:51: oops guess I need to add the rest of the files #rust-proj-setup
Also setting up my adhoc workstation while doing this...
10:01: Troubleshooting trackpad mouse
While setting up my adhoc workstation, my trackpad decided not to work. It was a faulty micro-USB cable. Tossed that in the trash.
10:07: Back to adding more files to voxbox
10:12: simple WAV file generators: didn't I do this already? #implement-monowav
I feel like I wrote a bare bones mono wav file generator somewhere in C.
10:18: can't find it, guess I'm doing it from scratch
going to write a quick and dirty one in C, then port it to rust. Reviewing the spec now.
10:54: Packing up to leave
Got side-tracked with Dan during this.
12:12: Arrival
12:33: Getting ready to work
12:34: filling out initial chunks #implement-monowav
12:44: Okay, poking at the hex data for the header #implement-monowav
12:56: main chunk done, now data chunk
hex viewer is nice. I'm assuming it's only going to be one data chunk with PCM. What I'm not sure about is: how does it know to do 16-bit and not like 10 or 12 bit?
13:11: wav written, but afplay doesn't like it #implement-monowav
13:14: Inspect riff data of reference
I figured there'd be a tool for this. exiftool looks to be about right.
13:16: why is the exif data saying my WAV has duration of 0? #implement-monowav
13:26: Time to generate a smaller reference file #implement-monowav
13:37: now we're only writing zeros #implement-monowav
13:44: why is there weird junk in the reference zeros? #implement-monowav
13:49: Don't know what the junk is, but the header is wrong #implement-monowav
I used sox to convert my generated wav into another wav, and it fixed some things. I diffed the xxd'd output and saw the header differences.
13:57: 36 byte difference between data payload chunk size and riff chunk size #implement-monowav
4 bytes for RIFF, 16 bytes for fmt, 16 bytes for data in RIFF?
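That 36 comes from everything sitting between the RIFF size field and the data payload: "WAVE" (4 bytes) + the fmt chunk with its header (8 + 16) + the data chunk header (8). A sketch of a minimal 16-bit mono PCM header under that assumption, not the actual monowav code:

```rust
// Build a minimal 44-byte header for 16-bit mono PCM WAV.
// Illustrative sketch; field values follow the common RIFF/WAVE layout.
fn wav_header(num_samples: u32, sample_rate: u32) -> Vec<u8> {
    let data_size = num_samples * 2; // 16-bit mono: 2 bytes per sample
    let mut h = Vec::new();
    h.extend_from_slice(b"RIFF");
    // RIFF chunk size = 36 + data payload:
    // "WAVE" (4) + fmt header+body (8 + 16) + data header (8) = 36
    h.extend_from_slice(&(36 + data_size).to_le_bytes());
    h.extend_from_slice(b"WAVE");
    h.extend_from_slice(b"fmt ");
    h.extend_from_slice(&16u32.to_le_bytes()); // fmt chunk body size
    h.extend_from_slice(&1u16.to_le_bytes()); // audio format: PCM
    h.extend_from_slice(&1u16.to_le_bytes()); // channels: mono
    h.extend_from_slice(&sample_rate.to_le_bytes());
    h.extend_from_slice(&(sample_rate * 2).to_le_bytes()); // byte rate
    h.extend_from_slice(&2u16.to_le_bytes()); // block align
    h.extend_from_slice(&16u16.to_le_bytes()); // bits per sample
    h.extend_from_slice(b"data");
    h.extend_from_slice(&data_size.to_le_bytes());
    h
}
```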
14:04: zerod wav works. now add sound. #implement-monowav
14:08: off to lunch. next steps #implement-monowav
next steps: get dynamic file size working.
15:07: Return from lunch. settled in. #implement-monowav
Now, to tackle the file size things. I'm anticipating when I port this to Rust, I'll want to be able to write a stream of indefinite size, get the total samples written to disk, and then update the header bits. It would be nice to do that using some kind of r/w mode. Otherwise, just close and re-open.
15:24: number of bytes acquired dynamically, now to update WAV
My Orphium project is updating stuff in-place, which flag is it?
Ah, appears to be "a+".
15:26: it seems to work without updates? #implement-monowav
Nevermind, just didn't update everything, and had to go back to the beginning of the file. Be kind, and rewind().
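The rewind-and-patch step might look roughly like this in Rust, assuming the standard 44-byte header layout (RIFF chunk size at byte offset 4, data chunk size at offset 40); the function name is hypothetical, not the monowav code:

```rust
use std::fs::OpenOptions;
use std::io::{Result, Seek, SeekFrom, Write};

// Sketch of the "write now, fix header later" pattern: after streaming an
// unknown number of samples, seek back and patch the two size fields.
fn patch_wav_sizes(path: &str, data_bytes: u32) -> Result<()> {
    let mut f = OpenOptions::new().read(true).write(true).open(path)?;
    // RIFF chunk size = 36 + data payload size, at offset 4
    f.seek(SeekFrom::Start(4))?;
    f.write_all(&(36 + data_bytes).to_le_bytes())?;
    // data chunk size, at offset 40
    f.seek(SeekFrom::Start(40))?;
    f.write_all(&data_bytes.to_le_bytes())?;
    Ok(())
}
```

This is the moral equivalent of C's rewind()/fseek() dance with a "w"-then-reopen or "r+" stream.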
15:28: back on track #implement-monowav
15:43: It works!
Making it a commandline application to take in input/output params.
15:46: "a+" doesn't actually work like I thought it would
I just looked up the man pages. "w" is fine.
15:48: things look good. next steps.
Next steps are to port a version of this code to rust as a mono wav writer to boing^3. After that, it should be ported to voxbox with some sine code to make sure I can get WAV files generated.
15:50: packing up, getting ready to watch presentations
16:00: Apple 2e MIDI, rest bug, coltrane timing
16:10: Neural nets as a musical instrument
DIY auto-encoder, RAVE: realtime auto-encoder, RAVE on Daisy (hey I know these guys!) (didn't work), on norns? Maybe I can help with the norns stuff?
16:15: Elvish PL and shell, testing
16:30: How to never miss a train
16:40: Everything is an Object
pharo smalltalk VM
16:46: Music tracking for the privacy conscious
Navidrome, clerk (Clojure)
16:52: Debugging and logging in vscode
"tech demo state"
16:56: PSAs and how to think about time at RC
Use checkin stream, show links in Zulip. "Closed" on Monday. Kind vs nice: try to be kind.
17:15: head back home
2024-05-22 Day 3. Laptop screen broken. What now?
08:40: off-by-one is going to confuse me
renaming logs so far
08:47: reading RC jobs advice #rc-jobs-advice
top 3 skills/attributes that I want to cultivate during my time here: Effectiveness, Agency, Conscientiousness
09:11: Setting up high-priority task list
09:17: Document task lists
It's already blowing up. I'm going to lose track.
I should add a description metadata field to the task lists. that way it stays with the task.
09:22: Metadata to task lists #add-meta-parser-tags
09:25: Created metadata tag parser task #add-meta-parser-tags
Proof of concept should be to add @!desc as a known tag for description.
11:15: Laptop screen broke
looking up how to fix this.
13:07: Settling down at a space
13:25: Walk/Lunch
Went to the apple store, which happens to be about a 15min walk away from the hub.
14:31: Course correction
15:20: getting ready for isorhythms study
I want to build and run this locally, and document my steps for this. I also would like to take a closer look at the JS involved and see if there's any way I can improve this.
15:22: attempting to clone and recompile
15:40: Recalling what the JS code does
Got into some discussions while this was happening
16:42: Initial VoxBox scaffolding
16:52: Break
17:00: non-programming talks
18:00: travel Home
2024-05-21 Day 2. Some initial system setups, various meetups
Day 2. Still remote.
08:35: Task system
I'm attempting to think up a simplified version of my zetdo system for use with RC. zetdo is built on top of weewiki/zet, which has too many intermediate parts. Ideally, I'd like to build up a simple markup that can be parsed and compiled into data for further processing later. Tasks have the following parts: a sentence describing the system, an optional description, and then a list of timestamped comments that can be arranged as a thread. Occasionally I group tasks into lists, but I haven't done anything complex enough to warrant a formal dependency system. For now, using my "descript" notation could be enough. This would allow me to easily build uniquely named blocks where each block could represent a task. > @foo-task this is a foo task > this is a foo description My timeline notation can then link to tasks using tags > @2024-05-21 > @08:48 I am writing a test comment for the foo task. #foo-task
08:51: Add some initial tasks #task-system-setup
What to do about tasks that are done? Maybe not think too hard about it and move to a done.txt and a nevermind.txt for tasks that I won't do.
09:01: oops update descript #task-system-setup
There was a bug in my parser that I fixed previously.
09:03: potential schema idea for tasks? #task-sqlite-gen
It seems like for groups/states, I'll be putting things into files. So, current tasks are in main.txt, done tasks are in done.txt, cancelled tasks in nevermind.txt, Rust-related things in rust.txt, reading in reading.txt, etc. Filename could be a meaningful parameter in the schema.
09:06: Initial rust reading list setup. #setup-rust-reading-list
09:16: Reading: how not to learn rust. #how-not-to-learn-rust
Attempting to take notes while I read.
Useful for autocompleting tags in Vim: https://stackoverflow.com/questions/10789430/vims-ctrlp-autocomplete-for-identifiers-with-dash. 'iskeyword+=\-' was good enough for my needs.
I'm realizing that I may want to spend some time getting used to how Rust is designed, instead of jumping into code (though I already have some boilerplate ready for porting DSP). I think overall things will be faster if I figure out how to think in Rust, rather than trying to apply my pre-existing notions of programming towards Rust.
Seems like some decent wisdom. There are many links and follow-ups, which may be worth looking at later.
10:04: Break/Snack
10:30: Walk
11:00: RC Software
11:45: Nap
12:35: VoxBox Initial planning :project-outline
13:00: Break
13:11: Update Log
13:13: Skipped the pairing session
I'm feeling a little bit bad about this, so I thought I'd write a few words on it. Firstly, I'm exhausted. The time between my last job and living situation and the start of my batch was too short. I knew this coming in. I'm adjusting to what have been, for me, some huge life changes in the last week.
Secondly, and perhaps more importantly, I think I'm also a little intimidated? Imposter syndrome, etc, etc. I've taken about a six month break from writing code. Lots of reading and research around the code, but coding itself has been fairly sparse outside of a few ad-hoc scripts. I also doubt my skills a fair bit. I really only know a modest amount of C, along with a handful of scripting languages on top of that. DSP and realtime audio put a limit on the kinds of languages you get to study. Most languages used today are simply not suitable for realtime or soft-realtime applications. As a result, I've never had many opportunities to work much with other languages. It's hard when the one language you feel only sort of comfortable in is so niche. I guess I wish I had more go-to "modern" languages that are less niche. I'm hoping I'll be better at getting over myself in the next few weeks.
Update: languages used in the pairing workshop: JavaScript, Java, Scala, C/C++, Clojure, Rust, Lisp, Racket, MIT Scheme, OCaml, Ruby, Python, SystemVerilog/Hardware, Elixir, Erlang, R, Golang, bash, csh, Kotlin, C#, Julia, Swift, SQL, Zig, Awk, Elm, Nim.
13:24: Update Rust
Looks like this is just "rustup update"
13:27: Revisit BoingBoingBoing
Can I get this to build and run still? Do I understand what is happening?
Do I have examples, and if I do, can I build them with Cargo? I think there's a way, and I'd like to look it up the slow way (instead of Stack Overflow: find a good book online, then find it in the book).
13:40: Update boing^3 README for example instructions
13:55: Break
14:00: Software Foundations
I made a dagzet for the book at "knowledge/software_foundations.dz"
14:43: attempt to make initial db generator
Adapted from my mkdb script.
Want to group things by top-level namespace, so I want to select up to the first (nth?) "/": https://stackoverflow.com/questions/21328295/sql-with-substring-to-a-special-character-with-sqlite
Select distinct top-level nodes from dagzet:
> select distinct(SUBSTR(name, 1, INSTR(name, "/") - 1)) from dz_nodes
Curious if I can get secondary ones.
15:00: Looking up career info
Making a "career" knowledge graph.
15:05: Break
15:35: Graphics
16:36: Break/Wrap-Up
2024-05-20 Day 1. First day of recurse. Organize.
First day of Recurse! Figuring out how to best organize myself. I want to use systems I already know, while also keeping the data out in the open and readable from GH, to keep myself accountable.
11:00: Initial Welcomes
11:10: Staff Introductions
11:15: Framework for Learning while at RC
11:28: Pillars of RC
11:40: Social Rules
11:49: Meet and Greets/Breakouts
12:32: Walk/Lunch
14:00: Advice and Introductions
14:55: Break
15:01: Ink / Pairing with Dan
TODO: look at NIH plug for Rust plugins
16:49: Writing this initial timestamp log
I am currently writing this first log, then adding it to the repo.
I need to import my tools and make sure they can work outside of my repo.
15:51: Workflow Planning :ink
16:16: blank
16:54: setting up evparse
17:04: transcribe event log for the day
17:10: blank
17:22: transcribe dagzet
Gotta get dagzet utility working.
17:41: scan and upload ink for workflow planning
This involves creating a rotation utility.
18:02: end
Most of them are more of the same, so that's a good starting point
Improved testing: moved unit tests to separate file, added some additional checks using SQLite in the test.sh shell script.
Added au and im commands