Dusting Off MySQL
- steincarl
- Feb 7, 2022
- 3 min read
Getting Started
On 13 Sep 2021, I purchased The SQL Workshop from Amazon using the company card. All of my tech/data/programming books are paid for, which is greatly appreciated.
I got through the first two chapters, which touch on creating tables, data types, column definitions, and CRUD operations. I did the exercises that reflected all of those processes to create a basic database for a fictional online shop, with the DateData database always in the back of my mind.
A few pages into Chapter 3, “Normalization,” the author advises:
“Take a break now and let all the information you’ve looked at sink in a little.”
Distractions
Unfortunately, my break ended up being about four months. Interests veered and priorities shifted. In the time away from “The SQL Workshop,” I did continue reading, just not that book.
I did finish:
- The Nimble Elephant by John Giles, all about data modeling and best practices. His “The Elephant in the Fridge” was one of the first books I read after starting at my position in late 2020/early 2021.
- An Introduction to Agile Data Engineering Using Data Vault 2.0 by Snowflake’s boi Kent Graziano.
- The MIT Press Essential Knowledge Series: METADATA by Jeffrey Pomerantz. This book was hella informative for a top-level understanding of where metadata comes from, how we encounter it on a daily basis, and what can be gained from metadata analysis.
Reading as a Routine
I read “work books” in the morning, for approximately 20-30m, ideally after a yoga session. I’m currently doing some sessions offered on the app Asana Rebel, which is a step up from basic, free YouTube vids (MadFit).
Squeezing in a few pages in the morning is now as engaging and routine as the fitness that comes before it.
I’ve (willingly) read more over the past year than I had in the previous ten years combined. Maybe that’s hyperbole. Always a gamer, sometimes a reader.
Back in MySQL
So now I’m back, restarting my relationship with MySQL, the DateData project, and journaling.
In another tab, the “How to Create a Database, Add Tables, and Import Data in MySQL Workbench” video is paused about halfway through.
There has been consistent brainstorming on how and where to get started on this database. I have stalled a lot. I have considered and reconsidered so, so much. Regrettably, I failed to document most of those thoughts. This journal is me making the effort to do better.
I picked a starting point. A fact table.
Fact tables should be thin and long. Few columns, lots of records.
The primary fact table will be: Occurrence
```sql
CREATE TABLE occurrence
(
    OccurrenceID     INT AUTO_INCREMENT,
    OccurrenceDate   CHAR(10),
    OccurrenceYear   CHAR(6),
    OccurrenceDetail VARCHAR(750),
    PRIMARY KEY (OccurrenceID)
);
```
OccurrenceID should probably be a randomized GUID.
Do I need to make its data type different if that's the case?
OccurrenceDate will just be date and month. Every entry has a date and a month; that is literally the sole requirement of a record.
OccurrenceYear can be null. I thought I needed to indicate that on table creation, but MySQL columns are nullable by default unless declared NOT NULL.
OccurrenceDetail is the description of the event. Entries can be lengthy AF, so I wanted to make sure the character limit was significant.
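Pulling those open questions together, here's one way the table could look if I do go the GUID route — a sketch only, and it assumes MySQL 8.0.13 or later, since earlier versions don't allow expression defaults like DEFAULT (UUID()):

```sql
-- Sketch, not final: swaps the INT key for a UUID string and
-- spells out the nullability I was wondering about.
CREATE TABLE occurrence
(
    OccurrenceID     CHAR(36) NOT NULL DEFAULT (UUID()),  -- randomized GUID as a string
    OccurrenceDate   CHAR(10) NOT NULL,                   -- date + month, the one hard requirement
    OccurrenceYear   CHAR(6)  NULL,                       -- explicit, though NULL is the default anyway
    OccurrenceDetail VARCHAR(750),                        -- room for lengthy-AF descriptions
    PRIMARY KEY (OccurrenceID)
);
```

(BINARY(16) with UUID_TO_BIN() would be more compact, but CHAR(36) is easier to eyeball in Workbench.)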
Finishing Up
The final part of this afternoon’s session will be importing columns from Excel into DateData, and I believe the 2nd half of the above-referenced YouTube video will address just that.
I don’t anticipate getting much further on the database than that today.
Pending energy levels, I may then dive into my LIST OF DATES notes to make headway on entering events into the master xlsx.
After Some Troubleshooting…
My CSV file keeps barfing. There are ~190 records, but I’m only getting ~145 upon import. What is causing the import to break?
I did a replace of all of my double quotes (“) with single ticks (‘), thinking that was part of it, but it did not seem to affect the number of imported rows.
I reviewed the CSV in VSCode and Mac TextEdit, and I don’t see any foreign characters that would break the import.
Is it the VARCHAR(750)/TEXT data type that's too much? How could that be “too much”? The whole file is like 25 KB. Is the data type incorrect?
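One suspect worth checking — an assumption on my part, not something I've confirmed: commas or line breaks inside OccurrenceDetail. If a detail field isn't wrapped in quotes, an embedded comma or newline can merge or split rows, and ~190 source rows quietly become ~145 imported ones. With LOAD DATA, the quoting rules can at least be stated explicitly instead of left to the Workbench wizard — the file path here is hypothetical:

```sql
-- Hypothetical path; assumes a header row and double-quoted fields.
LOAD DATA LOCAL INFILE '/path/to/datedata.csv'
INTO TABLE occurrence
FIELDS TERMINATED BY ','
  OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(OccurrenceDate, OccurrenceYear, OccurrenceDetail);
```

Running SHOW WARNINGS; right afterward lists any rows MySQL skipped or truncated, which should narrow down where the other ~45 went.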
Frazzled!!
I’m keeping the table, but blowing away all records until I can get a wholly accurate import.
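For the record, “keeping the table but blowing away all records” is a single statement:

```sql
-- Empties the table but keeps its definition, and resets AUTO_INCREMENT
TRUNCATE TABLE occurrence;
```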

Sunday Evening
I’m making a beef stew for dinner. It had been my intent to find a recipe for, like, the past few weeks. But the days have been cold and dark and short, and I’d been unmotivated.
I did get it together enough to pick up the ingredients on Friday, so today’s dinner should be hearty as hell.