I find that all the best blogs start with a disclaimer, and so on that basis, this is now one of the “best blogs”. To run my Idea Capture demo, I’m going to talk about using consumer medical equipment off-spec, which you really shouldn’t do. If you do decide to play along, I’d love to hear how it goes… but that detracts from the point of the disclaimer, which is that I advise you not to do what I’m doing. Ok let’s begin…
As someone who navigates the lively and slightly chaotic landscape of thoughts brought on by ADHD, I've come to realize that capturing ideas is a fleeting opportunity.
The brilliance of a concept shines brightly for mere seconds before it vanishes like mist in the morning sun.
This isn't unique to me; it's a universal experience that many can relate to.
During a recent conversation with a friend, I discovered that this "loss of idea" is more common than I had imagined. It led me to share my personal strategy for capturing my favourites among these ever-present thoughts: using Apple Notes and Microsoft 365 Copilot.
Apple Notes: The Digital Net for Catching Ideas
Apple Notes has become my digital net, allowing me to jot down my ideas in their raw, unfiltered form. It's a space where I can freely express the intricacies of my thoughts without the pressure of immediate coherence.
It’s never a perfect capture, but it doesn’t need to be; here the urgency is getting most of the idea out.
I see this as a massive win for accessibility, because I'd previously fill (and then lose) notebooks with rambling, disconnected thoughts, like some kind of tortured artist letting out creative energy before it turns destructive.
It’s one of the reasons I learned to touch type so young, and it’s the reason I can hold a conversation whilst typing notes on a different subject… neither nature nor nurture, but necessity.
Microsoft 365 Copilot: The Translator
Once the ideas are captured, I turn to Microsoft 365 Copilot to refine them into a "human-readable" format. It's like having a personal translator who understands the language of my mind and can articulate it in a way that resonates with others.
A Vision for Proactive Health Monitoring
So in this example, which I recorded a few days ago, I explored the potential of continuous glucose monitors (CGMs) from Dexcom for proactive health diagnostics.
Without any medical training, but with the hesitant endorsement of GP friends, I hypothesized that CGMs could enable the early detection of diabetes. This could alleviate pressure on the NHS and reduce the financial burden of diagnosis and chronic illness management.
I envisioned a world where affordable consumer access to CGMs could identify undiagnosed diabetics and prediabetics for approximately £50 per person. This could significantly lower the £2000 average cost of diabetes diagnosis and the subsequent annual expenses for inpatient care and consumables.
The Home OGTT Experiment
Inspired by the home oral glucose tolerance test (OGTT) often conducted during pregnancy, I considered the possibility of a similar approach using jelly babies. This could also extend to a rudimentary lactose intolerance test.
Integrating with Digital Health Records
The raw data from these tests could be seamlessly integrated into NHS Digital's Personal Health Record API and private practice systems like Semble, enhancing patient advocacy through platforms like Patients Know Best.
Beyond Data Sharing: Proactive Alerts
The system could evolve to not only share data but also proactively alert users to potential health concerns by flagging abnormal test results, assuming a non-diabetic baseline.
The Limitations of My Medical Expertise
While my medical knowledge is limited, my curiosity isn't. I've toyed with the idea of self-experimentation and even developing a proof-of-concept app for preliminary data interpretation. With a long-haul flight ahead, I know exactly how I'll be spending those hours.
The main event
Here's the 3:30 video demo. Things I've learned:
I shouldn’t riff a piece to camera
I look like a Viking, and need a haircut
I talk much faster when I’m recording!
Something I forgot to mention is that I don't exclusively go to the beach to catch my thoughts. Sometimes I'll set the phone to transcribe, put it in my pocket, and chat to it whilst walking to work.
On the CGM idea…
I invite GPs and endocrinologists to weigh in on my idea. Is it a step in the right direction, or am I missing crucial medical insights?
On the Power of Voice Notes and Copilot
Do you find the use of Voice Notes and Copilot an effective way to capture and refine your ideas?
I'd love to hear your thoughts, comment below, or over on LinkedIn.
UPDATE:
I’ve been given a Dexcom G7 CGM! It’s in my arm and we’re getting data…
I'll spend some time pulling the data processor together, and probably see if I can run one of those OGTTs (oral glucose tolerance tests) using Jelly Babies (yes, that's real!).
I stuck the CGM on fully shortly after this video, and applied the overlay too, once I was shown how.
I should remind you that this is only loosely medically supervised - I’ve got people available to run these ideas past, and I’ve got people who will help me to interpret the data. If you’re going to try something similar, make sure you’ve got the right safeties in place.
Update 16th May 24:
I put the CGM into my arm last week, and on 6th May I ran a basic OGTT (oral glucose tolerance test).
Using that data, here's the proof of concept that I'd envisaged.
It's JavaScript, and runs entirely client side (i.e. no data leaves the computer where the script runs). What's happening in the screenshot:
The Dexcom G7 CGM is writing data to Apple Health (other CGMs are available)
I wrote a simple Shortcut to extract that data to a Note (and then a CSV), which is plotted onto a plot.ly chart
I tracked carbs over the time period using MyFitnessPal, and because there are fewer than a dozen rows, I transcribed those numbers into a CSV by hand, but there's scalability for next time if I wanted to use a full year's data
I pulled the OGTT diagnostic ranges from NICE, and overlaid those onto the chart, where green = normal read, amber = potential pre-diabetes, red = potential diabetes
The script calculates the appropriate glucose range for the "fasted", "preprandial" and "postprandial" time periods around each carb intake, and builds a time-contexted RAG marker - there are different ranges for wake-up vs postprandial (after eating).
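To give a flavour of the RAG logic, here's a minimal sketch of a classifier like the one the script builds. It is not the actual demo code; the thresholds are the commonly published NICE/WHO OGTT bands in mmol/L, and they vary by population (pregnancy, children, T1/T2 diabetes), so treat them as illustrative only and check the source guidance before relying on them.

```javascript
// Illustrative RAG thresholds (mmol/L), approximated from published
// NICE/WHO OGTT guidance. The real demo also distinguishes a
// "preprandial" band; here it's folded into "fasted" for brevity.
const BANDS = {
  fasted:       { amber: 6.1, red: 7.0 },  // wake-up / fasted reading
  postprandial: { amber: 7.8, red: 11.1 }, // ~2 hours after carb intake
};

// Classify a glucose reading for a given time period:
// green = normal, amber = potential pre-diabetes, red = potential diabetes.
function ragMarker(glucoseMmol, period) {
  const band = BANDS[period];
  if (!band) throw new Error(`unknown period: ${period}`);
  if (glucoseMmol >= band.red) return "red";
  if (glucoseMmol >= band.amber) return "amber";
  return "green";
}
```

So a fasted reading of 5.2 mmol/L comes back "green", while a postprandial 12.0 mmol/L comes back "red" - the chart just paints those colours over the relevant time segment.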
Now the chart isn't perfect. I've got a preprandial period that runs in the wrong direction (it starts at "carbs ingested" rather than ending there, because it obscured part of the graph), and there are different glucose bandings for pregnancy, children, and T1 and T2 diabetes - a whole bunch of reasons why this wouldn't be right to roll out in its current form. It also shows invalid periods of time, like where carbs are consumed too closely together. Whilst those periods are greyed over, I chose not to suppress the time segments during those "don't-read" periods, because I wanted to expose the calculation, even when it wasn't technically valid at that point in time.
What I can see here, though, is that with the emergence of more IoT medical devices and the advancement of technology, some of these insights could be moved into the hands of the general population, and could hopefully offload some effort from our medical system.
It's been a pretty interesting thought experiment, and was a decent way to spend my time in the air. I could see this being useful if validated, though I could see a bunch of disbenefits too (patient panic, misinterpretation, false negatives, partial diagnostics, added pressure on the supply of CGMs, interpretations without medical training, etc.). The ethical considerations around that would really need to be thought through.
Here's the code for anyone who wants a mooch:
Usage:
Capture blood sugar into a CSV containing no more than 1 reading per 30 seconds.
The CSV must be formatted as: time in ISO 8601 format, and glucose reading in mmol/L
Hit "Choose File" in the blood readings section, and "upload" that CSV, to generate a graph of blood sugar over time.
Capture Carbs intake into a CSV
The CSV must be formatted as: time in ISO 8601 format, and carb intake in grams
Hit "Choose File" in the carb intake tile, and then hit "Go" to process that CSV, overlaying your existing chart with time segments where wake-time and post-prandial readings are RAG-boundaried as "normal reading", "potential pre-diabetic reading", and "potential diabetic reading".
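If you're building the files by hand, here's a minimal sketch of the parsing those two formats imply. This is a hypothetical helper, not the code from the demo; it assumes two comma-separated columns per line with no header row.

```javascript
// Parse a two-column CSV of "ISO 8601 time, numeric value" rows -
// e.g. blood sugar in mmol/L, or carb intake in grams.
// Assumes no header row; blank lines are skipped; bad rows throw.
function parseTimeSeriesCsv(text) {
  return text
    .split(/\r?\n/)
    .filter((line) => line.trim().length > 0)
    .map((line) => {
      const [timeStr, valueStr] = line.split(",").map((s) => s.trim());
      const time = new Date(timeStr);
      const value = parseFloat(valueStr);
      if (isNaN(time.getTime()) || isNaN(value)) {
        throw new Error(`bad row: ${line}`);
      }
      return { time, value };
    });
}
```

For example, `parseTimeSeriesCsv("2024-05-06T08:00:00Z,5.4\n2024-05-06T08:30:00Z,7.9")` gives two `{ time, value }` objects ready to feed into a chart.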
Caution: do not use this for diagnosis. Do not use this for insulin decision-making. This is experimental code and is not clinically validated. This code is for educational and experimental use only. There are a lot of disclaimers here - for good reason.
Sample Files
Here's some content for sample files: just expand the filename, copy and paste the numbers into Notepad, save as bloodSugar.csv or carbIntake.csv (or whatever you want), and then pull those files into the page to see it working.
bloodSugar.csv
carbIntake.csv
Datapoint extract from Apple Health
Example Apple Shortcut
Update 17th May 24
Some of you contacted me saying you'd had difficulty creating the demo files... I'd only really been thinking about people reviewing the code rather than interacting with it, but I see my error!
On that basis, I've just updated the code so that if you hit "Go" on the blood reads and then carb reads, without choosing any file to use, it'll load some sample data. 👍
Hope you enjoyed it!
R