Earlier this week I was writing some code that used a statically linked C library, and I tried to use a default value that the library's documentation used in its examples, DEFAULT_CAMERA_CONFIGURATION.
I was able to use everything else in the library just fine, but when I tried to use this constant, my program wouldn’t compile; instead, the linker failed with a SYMBOL_NOT_FOUND error.
When you compile code into a library or an executable, the names (variables, functions, constants) are called symbols.
RC Day 3
Not that much to talk about today.
In the morning, after some sleep, I had a minor epiphany about wasm importing. rust-wasm-template makes an npm package, which create-wasm-app can import directly as JS, whereas rust-webpack-template makes a wasm bundle that your JS has to import as wasm. *I think*, and I’m not sure yet, that’s why I’m trying both.
I created an event for tomorrow afternoon to meet Rust-interested people.
RC Day 2
Note: This post isn’t going to be as informative; it’s more stream-of-thought work notes.
I wasn’t as productive today. I didn’t sleep well last night, and I made almost no progress on my Rust/Wasm project. I also tired myself out by talking and pairing a lot (but I want the practice!)
There are chatbots that match you for pair programming and for coffee chats, and I signed up for both of them.
The morning and RC itself
The first half of the day was overwhelming because there were a lot of people to meet and talk to. But RC is really nice! Lots of nice people, and nice in a way I hadn’t experienced before.
First, there were a few intro speeches. I really liked the Social Rules sketches, because it’s one thing to hear “no well actuallies”, and it’s another to watch an example play out.
Including The Most Demanding Stackoverflow Question I Have Ever Seen, and Rewriting it in Rust
The other day I found this post on the Domino Data Science blog that covers calculating the PCA of a matrix with 1 million rows and 13,000 columns. That’s pretty big as far as PCA usually goes. They used a Spark cluster on a 16-core machine with 30 GB of RAM, and it took them 27 hours.
I read up a bit on PCA and realized that you can run PCA on large (several-billion-element) matrices much faster, and without any Big Data tech like Spark, by using better algorithms and more RAM.
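To give a flavor of what “better algorithms” can mean here (a general sketch, not necessarily the exact method the Domino post or I used): randomized SVD never decomposes the full matrix. It projects the data onto a small random subspace first, so the only expensive operations are two matrix products, and the matrix you actually run an SVD on has just k + oversample rows. A minimal NumPy version, assuming the data fits in memory:

```python
import numpy as np

def randomized_pca(X, k, oversample=10, seed=0):
    """Approximate the top-k principal components of X (n_samples x n_features)
    using a randomized range finder + SVD on a small projected matrix."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xc = X - X.mean(axis=0)                      # center each feature
    # Project onto a random (k + oversample)-dimensional subspace.
    G = rng.standard_normal((d, k + oversample))
    Q, _ = np.linalg.qr(Xc @ G)                  # orthonormal basis for the range of Xc
    # SVD of the small (k + oversample) x d matrix instead of the full n x d one.
    _, s, Vt = np.linalg.svd(Q.T @ Xc, full_matrices=False)
    components = Vt[:k]                          # principal axes (rows)
    explained_variance = s[:k] ** 2 / (n - 1)    # variance along each axis
    return components, explained_variance
```

For something like a 1,000,000 × 13,000 matrix, the two big products (`Xc @ G` and `Q.T @ Xc`) can even be computed in row chunks streamed from disk, so peak memory is dominated by the 13,000-column intermediates rather than by n.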
This post is from my old blog. It’s about a weekend project where I downloaded a bunch of football match data and did some light analysis of it. I had further plans to use it for “Machine Learning” and try my hand at a prediction engine, but I didn’t get that far. Sadly, I couldn’t get the pictures back. It’s a great example of where I was 4 years ago and reminds me of the progress I’ve made.