Compare commits


18 commits
ai-code ... owo

Author SHA1 Message Date
e7bb41d8ab Claude: Output summary as csv with csv-stringify 2026-03-04 13:54:22 +00:00
f0df9bccf7 Claude: add summary.ts 2026-03-04 12:20:11 +00:00
312e499fac Use custom snapshots instead of nodejs's to handle carriage returns and better diffing/flow of running tests 2026-03-04 06:00:01 -05:00
d14f1ec778 Claude: Add fitbit export, though tests are broken due to snapshot oddities 2026-03-03 07:39:50 +00:00
c84d901dd5 Added fitbit test fixture 2026-03-02 08:42:47 -05:00
6fd859e057 Scrub now handles CSVs (in Typescript), scrub now has tests (for both TS and jq scrubbing), scrub has many more cases 2026-03-02 01:54:20 -05:00
1665069593 Use Nodejs's native implementation of globSync(), colocate all zipFs weirdness in it's own file 2026-03-01 15:48:41 -05:00
4a50ed5d74 Consolidate all separate export tests into single file + single test strategy 2026-03-01 13:20:10 -05:00
06d5dec7e7 Claude: added 3 new exporters + tests 2026-02-28 02:30:39 +00:00
c093fbfcee Added 3 more scrubbed fixtures for new exports, scrub added boolean and numeric key scrubbing 2026-02-27 03:39:42 -05:00
7d815833e6 Rewrote timelinize.ts to work, and added new features for it. Added aggregateColumns for aggregated header, added metaIdValue to track which aggregate has metadata for another TaskTarget, added each() to allow a method of using cmd() with .id and other properties, added execPaths to make the initial definition of TaskTarget array to be a little more succinct 2026-02-26 16:32:33 -05:00
a4fbe1618d Fixed FB dating messages, added metadata as output table, added aggregate message thread metadata from FB
* aggregateId is now metadata and it's just aggregate: boolean and uses .id instead
* Use csv-parse for tests
* Update test snapshots
2026-02-26 11:21:36 -05:00
f6d0427a45 Converted TaskTargetPipelineHelper to more functional style, added aggregate() functionality to bring together multiple exports (no tests, but works)
* made parallel generic (not tied to TaskTarget)
* pulled common higher-order/frontend operations into io.ts
* split timelinize specific functionality into own file
* Tests made to pass and match previous facebook export snapshots _exactly_
2026-02-26 00:14:10 -05:00
9c3bdaa100 Broken AI code adding more columns 2026-02-24 03:50:15 +00:00
845ceb4c84 Change up main for timelinize output 2026-02-22 11:52:42 -05:00
80be7de844 Claude: annotate all facebook tables with metadata and description preliminarily 2026-02-22 13:33:35 +00:00
3e64969e05 Clean up API, make columnMeta a bit more specific 2026-02-22 08:22:38 -05:00
f6b0f02de7 setId(), types(), csvSink() become assignMeta(), clean up unused task.ts stuff 2026-02-22 05:23:17 -05:00
525 changed files with 10365 additions and 1311 deletions

.gitignore (vendored): 8 changed lines

@@ -1,7 +1,5 @@
 node_modules/
-your.db
-data-export/oldfacebook.ts
-OUTTEST
+*.db
+your.csv
 .gitSAFE
-out.manifest
-test.manifest
+*.DELETE-THIS-HAS-PII

README.md (new file): 25 lines

@@ -0,0 +1,25 @@
# base-data-manager
A TypeScript project for parsing many types of data exports into tabular formats.
**This is heavily WIP, and mostly just a toy for myself**
### Installation
* Install `jq`
* Install sqlite `csv.so` extension (Hardcoded to `/home/cobertos/sqlite-files/` currently)
* Install `node` + `pnpm i`
* See `main.ts` for current example usage
### Proposed Architecture
The architecture runs in 2 steps.
The first step is unopinionated in its output format. It's meant to take the source data exactly as-is and output it as CSV. All source data should pass through, but will be normalized as CSV.
**TODO: It's not completely unopinionated, there is some normalization for names of columns I think we want to apply? Or maybe we apply that later...**
An optional second step combines everything into a single SQLite database. From here we normalize many different types of data across multiple exports into a single opinionated output. For example, message threads/channels should all have the same table format, or end up in the same table.
**TODO: No idea if the second part should be a part of this project... but it currently is**
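The two-step flow above can be sketched in plain TypeScript. This is a minimal illustration, not this repo's real API: `Row`, `toCsv`, and the pass-through shape are assumptions made for the example.

```typescript
// Minimal sketch of step 1 (unopinionated pass-through to CSV).
// Names here are illustrative assumptions, not the project's real API.

type Row = Record<string, string>;

// Each source record becomes one CSV row; values are quoted (with
// embedded quotes doubled) but otherwise left exactly as-is.
function toCsv(rows: Row[]): string {
  if (rows.length === 0) return "";
  const header = Object.keys(rows[0]);
  const esc = (v: string) => `"${v.replace(/"/g, '""')}"`;
  return [header, ...rows.map(r => header.map(h => r[h] ?? ""))]
    .map(cols => cols.map(esc).join(","))
    .join("\n");
}

// Step 2 (optional) would load every step-1 CSV into one SQLite
// database and normalize across exports; it is elided here.

const csv = toCsv([{ id: "1", text: 'hi "there"' }]);
// csv === '"id","text"\n"1","hi ""there"""'
```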


@@ -0,0 +1,69 @@
import { pipe, each, cmd, assignMeta, glob, read, branchGen, type PipelineOp } from "./task.ts";
/**
* Extracts the channel ID from the filename, e.g.
* "GuildName - Text Channels - ChannelName [0000000000000000].json" → "0000000000000000"
*/
function chatExporterChannelId(t: { path: string }): string {
const match = t.path.match(/\[([^\]]+)\]\.json$/);
return match?.[1] ?? t.path.split('/').pop()!;
}
/**
* Channel metadata aggregate + messages table, one pair per exported JSON file.
* Unlike the native Discord export, DiscordChatExporter captures ALL authors' messages.
*/
function discord_chat_exporter_messages(): PipelineOp {
return branchGen(function* () {
// Channel-level metadata aggregated into a single table
yield pipe(
glob(`*.json`),
assignMeta({ idValue: t => `DiscordCE - Channel ${chatExporterChannelId(t)}` }),
read(),
each(t => t.clone().cmd(["jq", "-r", `
["${t.id}", .guild.name, .channel.name, .channel.type, (.channel.category // ""), (.channel.topic // ""), .messageCount]
| @csv
`])),
assignMeta({
aggregate: true,
aggregateColumns: ["id", "guild_name", "channel_name", "channel_type", "channel_category", "channel_topic", "message_count"],
idValue: "DiscordCE - Messages Meta",
})
);
// The messages — one table per exported file
yield pipe(
glob(`*.json`),
assignMeta({ idValue: t => `DiscordCE - Messages ${chatExporterChannelId(t)}` }),
read(),
cmd(["jq", "-r", `
["id", "timestamp", "author", "discriminator", "content", "attachment"],
(
.messages[]
| [
.id,
.timestamp,
.author.name,
(.author.discriminator // ""),
.content,
(.attachments[0].url // "")
]
)
| @csv
`]),
assignMeta({
metaIdValue: "DiscordCE - Messages Meta",
columnMeta: ["any", "isodatetime", "sender", "any", "text", "url"],
perRowDescription: '"{4}" from {2} at {1}',
perRowTags: "discord,message",
})
);
});
}
export function discord_chat_exporter(): PipelineOp {
return pipe(
assignMeta({ idValue: t => `DiscordCE - ${t.basename}` }),
discord_chat_exporter_messages()
);
}

data-export/discord.ts (new file): 201 lines

@@ -0,0 +1,201 @@
import { pipe, each, cmd, assignMeta, cd, glob, read, branchGen, type PipelineOp } from "./task.ts";
/** Extracts the channel ID directory name from paths like messages/{channelId}/messages.csv */
function discordChannelId(t: { path: string }): string {
return t.path.split('/').slice(-2, -1)[0];
}
/** Linked third-party accounts (Steam, Twitch, etc.) from account/user.json */
function discord_connections(): PipelineOp {
return pipe(
cmd(["jq", "-r", `
["type", "name", "id", "verified", "visibility"],
(
.connections[]?
| [.type, .name, .id, .verified, .visibility]
)
| @csv
`]),
assignMeta({
idValue: "Discord - Connections",
columnMeta: ["text", "text", "any", "any", "any"],
perRowDescription: '{0} account "{1}"',
perRowTags: "discord",
})
);
}
/** Friends, blocked users, and other relationships from account/user.json */
function discord_relationships(): PipelineOp {
return pipe(
cmd(["jq", "-r", `
["username", "discriminator", "type"],
(
.relationships[]?
| [.user.username, .user.discriminator, .type]
)
| @csv
`]),
assignMeta({
idValue: "Discord - Relationships",
columnMeta: ["text", "any", "any"],
perRowDescription: '{0}#{1} (relationship type {2})',
perRowTags: "discord",
})
);
}
/** Purchase history from account/user.json */
function discord_payments(): PipelineOp {
return pipe(
cmd(["jq", "-r", `
["created_at", "description", "amount", "currency", "status"],
(
.payments[]?
| [.created_at, .description, .amount, .currency, .status]
)
| @csv
`]),
assignMeta({
idValue: "Discord - Payments",
columnMeta: ["isodatetime", "text", "numeric", "text", "any"],
perRowDescription: '{1}: {2} {3} on {0}',
perRowTags: "discord,payment",
})
);
}
/** Application/game play-time statistics from account/user.json */
function discord_activity_stats(): PipelineOp {
return pipe(
cmd(["jq", "-r", `
["application_id", "last_played_at", "total_duration"],
(
.user_activity_application_statistics[]?
| [.application_id, .last_played_at, .total_duration]
)
| @csv
`]),
assignMeta({
idValue: "Discord - Activity Stats",
columnMeta: ["any", "isodatetime", "numeric"],
perRowDescription: 'App {0}: {2}s played, last at {1}',
perRowTags: "discord",
})
);
}
/**
* Activity event logs from activity/{subdir}/events-*.json (NDJSON format).
* Each subdirectory (analytics, modeling, reporting, tns) becomes its own table.
* Fields chosen for what the user did: event type, when, where (channel/guild),
* which message, which game, and human-readable channel/guild names when available.
*/
function discord_activity_events(): PipelineOp {
return pipe(
glob(`activity/*/events-*.json`),
assignMeta({ idValue: t => `Discord - Activity ${t.path.split('/').slice(-2, -1)[0]}` }),
read(),
// NDJSON: use -n + inputs so jq processes all lines, emitting one header then N rows
cmd(["jq", "-rn", `
["event_type", "timestamp", "channel_id", "guild_id", "message_id", "game_name", "channel_name", "guild_name"],
(
inputs
| [
.event_type,
.timestamp,
(.channel_id // ""),
(.guild_id // ""),
(.message_id // ""),
(.game_name // ""),
(.channel_name // ""),
(.guild_name // "")
]
)
| @csv
`]),
assignMeta({
columnMeta: ["text", "isodatetime", "any", "any", "any", "text", "text", "text"],
perRowDescription: '{0} at {1}',
perRowTags: "discord,activity",
})
);
}
/** Notes the user wrote on other users, keyed by user ID, from account/user.json */
function discord_notes(): PipelineOp {
return pipe(
cmd(["jq", "-r", `
["user_id", "note"],
(
.notes // {}
| to_entries[]
| [.key, .value]
)
| @csv
`]),
assignMeta({
idValue: "Discord - Notes",
columnMeta: ["any", "text"],
perRowDescription: 'Note on {0}: "{1}"',
perRowTags: "discord",
})
);
}
/**
* Messages from messages/{channelId}/messages.csv and channel metadata from
* messages/{channelId}/channel.json.
* NOTE: The export only contains the exporting user's own messages.
*/
function discord_messages(): PipelineOp {
return branchGen(function* () {
// Channel-level metadata aggregated into a single table
yield pipe(
glob(`messages/*/channel.json`),
assignMeta({ idValue: t => `Discord - Channel ${discordChannelId(t)}` }),
read(),
each(t => t.clone().cmd(["jq", "-r", `
["${t.id}", .type, (.name // ""), (.guild.id // ""), (.guild.name // ""), ((.recipients // []) | join(","))]
| @csv
`])),
assignMeta({
aggregate: true,
aggregateColumns: ["id", "type", "name", "guild_id", "guild_name", "recipients"],
idValue: "Discord - Messages Meta",
})
);
// The messages themselves — one table per channel
yield pipe(
glob(`messages/*/messages.csv`),
assignMeta({ idValue: t => `Discord - Messages ${discordChannelId(t)}` }),
read(),
// Replace the header row with normalized lowercase column names
cmd(["sed", "-e", "1s/.*/id,timestamp,content,attachment/"]),
assignMeta({
metaIdValue: "Discord - Messages Meta",
columnMeta: ["any", "isodatetime", "text", "url"],
perRowDescription: '"{2}" at {1}',
perRowTags: "discord,message,content_by_me",
})
);
});
}
export function discord(): PipelineOp {
return pipe(
assignMeta({ idValue: t => `Discord - ${t.basename}` }),
branchGen(function* () {
yield discord_messages();
yield pipe(cd(`account/user.json`), read(), discord_connections());
yield pipe(cd(`account/user.json`), read(), discord_relationships());
yield pipe(cd(`account/user.json`), read(), discord_payments());
yield pipe(cd(`account/user.json`), read(), discord_activity_stats());
yield pipe(cd(`account/user.json`), read(), discord_notes());
yield discord_activity_events();
})
);
}

File diff suppressed because it is too large

data-export/fitbit.ts (new file): 747 lines

@@ -0,0 +1,747 @@
import { pipe, cmd, assignMeta, cd, glob, read, branchGen, branch, type PipelineOp } from "./task.ts";
/** Single CSV passthrough — cd to file, read, set id */
function csvOne(filePath: string, id: string, tags = "fitbit"): PipelineOp {
return pipe(
cd(filePath), read(),
assignMeta({ idValue: id, perRowTags: tags })
);
}
/**
* Aggregate multiple date/number-suffixed CSV files into one table.
* Strips each file's own header row with `tail -n +2`;
* the framework emits `aggregateColumns` as the single header.
*/
function csvMany(pattern: string, id: string, columns: string[], tags = "fitbit"): PipelineOp {
return pipe(
glob(pattern),
assignMeta({ idValue: id, aggregate: true, aggregateColumns: columns, perRowTags: tags }),
read(),
cmd(["tail", "-n", "+2"])
);
}
/**
* Aggregate multiple date/number-suffixed JSON array files into one table.
* `jqBody` should output CSV rows only (no header line); the framework
* emits `aggregateColumns` as the single header.
*/
function jsonMany(pattern: string, id: string, columns: string[], jqBody: string, tags = "fitbit"): PipelineOp {
return pipe(
glob(pattern),
assignMeta({ idValue: id, aggregate: true, aggregateColumns: columns, perRowTags: tags }),
read(),
cmd(["jq", "-r", jqBody])
);
}
// ============================================================
// Application
// ============================================================
function fitbit_account_access_events(): PipelineOp {
// Files are paginated: Account_Access_Events_1.csv, Account_Access_Events_2.csv, ...
return csvMany(
"Application/Account_Access_Events_*.csv",
"Fitbit - Account Access Events",
["timestamp", "event_name", "email", "location", "ip", "outcome", "reason", "application", "device_info"]
);
}
function fitbit_account_management_events(): PipelineOp {
return csvMany(
"Application/Account_Management_Events_*.csv",
"Fitbit - Account Management Events",
["timestamp", "event_name", "email", "location", "ip", "outcome", "reason"]
);
}
function fitbit_email_audit(): PipelineOp {
return csvOne("Application/User_Email_Audit_Entry.csv", "Fitbit - Email Audit");
}
function fitbit_retired_passwords(): PipelineOp {
return csvOne("Application/User_Retired_Password.csv", "Fitbit - Retired Passwords");
}
// ============================================================
// Heart
// ============================================================
function fitbit_afib_enrollment(): PipelineOp {
return csvOne("Heart/afib_ppg_enrollment.csv", "Fitbit - AFib Enrollment");
}
function fitbit_hr_notification_alerts(): PipelineOp {
return csvOne("Heart/Heart Rate Notifications Alerts.csv", "Fitbit - HR Notification Alerts");
}
function fitbit_hr_notification_profile(): PipelineOp {
return csvOne("Heart/Heart Rate Notifications Profile.csv", "Fitbit - HR Notification Profile");
}
// ============================================================
// Menstrual Health
// ============================================================
function fitbit_menstrual_cycles(): PipelineOp {
return csvOne("Menstrual Health/menstrual_health_cycles.csv", "Fitbit - Menstrual Cycles");
}
function fitbit_menstrual_symptoms(): PipelineOp {
return csvOne("Menstrual Health/menstrual_health_symptoms.csv", "Fitbit - Menstrual Symptoms");
}
function fitbit_menstrual_birth_control(): PipelineOp {
return csvOne("Menstrual Health/menstrual_health_birth_control.csv", "Fitbit - Menstrual Birth Control");
}
function fitbit_menstrual_settings(): PipelineOp {
return csvOne("Menstrual Health/menstrual_health_settings.csv", "Fitbit - Menstrual Settings");
}
// ============================================================
// Other
// ============================================================
function fitbit_oxygen_variation(): PipelineOp {
return csvMany(
"Other/estimated_oxygen_variation-*.csv",
"Fitbit - Estimated Oxygen Variation",
["timestamp", "Infrared to Red Signal Ratio"]
);
}
// ============================================================
// Personal & Account
// ============================================================
function fitbit_profile(): PipelineOp {
return csvOne("Personal & Account/Profile.csv", "Fitbit - Profile");
}
function fitbit_devices(): PipelineOp {
return csvOne("Personal & Account/Devices.csv", "Fitbit - Devices");
}
function fitbit_trackers(): PipelineOp {
return csvOne("Personal & Account/Trackers.csv", "Fitbit - Trackers");
}
function fitbit_scales(): PipelineOp {
return csvOne("Personal & Account/Scales.csv", "Fitbit - Scales");
}
function fitbit_tracker_config(): PipelineOp {
return csvOne("Personal & Account/Tracker Optional Configuration.csv", "Fitbit - Tracker Optional Configuration");
}
function fitbit_ios_notifications(): PipelineOp {
return csvOne("Personal & Account/iOS App Notification Settings.csv", "Fitbit - iOS App Notification Settings");
}
function fitbit_height(): PipelineOp {
return jsonMany(
"Personal & Account/height-*.json",
"Fitbit - Height",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_weight(): PipelineOp {
return jsonMany(
"Personal & Account/weight-*.json",
"Fitbit - Weight",
["logId", "weight", "bmi", "date", "time", "source"],
`.[] | [.logId, .weight, .bmi, .date, .time, .source] | @csv`
);
}
// ============================================================
// Physical Activity
// ============================================================
function fitbit_active_zone_minutes(): PipelineOp {
return csvMany(
"Physical Activity/Active Zone Minutes - *.csv",
"Fitbit - Active Zone Minutes",
["date_time", "heart_zone_id", "total_minutes"]
);
}
function fitbit_activity_goals(): PipelineOp {
return csvOne("Physical Activity/Activity Goals.csv", "Fitbit - Activity Goals");
}
function fitbit_daily_readiness(): PipelineOp {
return csvMany(
"Physical Activity/Daily Readiness User Properties - *.csv",
"Fitbit - Daily Readiness User Properties",
["property_type", "value", "last_update"]
);
}
function fitbit_calories(): PipelineOp {
return jsonMany(
"Physical Activity/calories-*.json",
"Fitbit - Calories",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_vo2_max(): PipelineOp {
return jsonMany(
"Physical Activity/demographic_vo2_max-*.json",
"Fitbit - Demographic VO2 Max",
["dateTime", "demographicVO2Max", "demographicVO2MaxError", "filteredDemographicVO2Max", "filteredDemographicVO2MaxError"],
`.[] | [.dateTime, .value.demographicVO2Max, .value.demographicVO2MaxError, .value.filteredDemographicVO2Max, .value.filteredDemographicVO2MaxError] | @csv`
);
}
function fitbit_distance(): PipelineOp {
return jsonMany(
"Physical Activity/distance-*.json",
"Fitbit - Distance",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_exercises(): PipelineOp {
return jsonMany(
"Physical Activity/exercise-*.json",
"Fitbit - Exercises",
["logId", "activityName", "activityTypeId", "averageHeartRate", "calories", "duration", "activeDuration", "steps", "logType", "startTime", "hasGps"],
`.[] | [.logId, .activityName, .activityTypeId, (.averageHeartRate // ""), .calories, .duration, .activeDuration, (.steps // ""), .logType, .startTime, .hasGps] | @csv`
);
}
function fitbit_heart_rate(): PipelineOp {
return jsonMany(
"Physical Activity/heart_rate-*.json",
"Fitbit - Heart Rate",
["dateTime", "bpm", "confidence"],
`.[] | [.dateTime, .value.bpm, .value.confidence] | @csv`
);
}
function fitbit_lightly_active_minutes(): PipelineOp {
return jsonMany(
"Physical Activity/lightly_active_minutes-*.json",
"Fitbit - Lightly Active Minutes",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_moderately_active_minutes(): PipelineOp {
return jsonMany(
"Physical Activity/moderately_active_minutes-*.json",
"Fitbit - Moderately Active Minutes",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_resting_heart_rate(): PipelineOp {
// Some entries have null value; filter those out
return jsonMany(
"Physical Activity/resting_heart_rate-*.json",
"Fitbit - Resting Heart Rate",
["dateTime", "value", "error"],
`.[] | select(.value != null and .value.value != null) | [.dateTime, .value.value, .value.error] | @csv`
);
}
function fitbit_sedentary_minutes(): PipelineOp {
return jsonMany(
"Physical Activity/sedentary_minutes-*.json",
"Fitbit - Sedentary Minutes",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_steps(): PipelineOp {
return jsonMany(
"Physical Activity/steps-*.json",
"Fitbit - Steps",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
function fitbit_swim_lengths(): PipelineOp {
return jsonMany(
"Physical Activity/swim_lengths_data-*.json",
"Fitbit - Swim Lengths",
["dateTime", "lapDurationSec", "strokeCount", "swimStrokeType", "swimAlgorithmType"],
`.[] | [.dateTime, .value.lapDurationSec, .value.strokeCount, .value.swimStrokeType, .value.swimAlgorithmType] | @csv`
);
}
function fitbit_time_in_hr_zones(): PipelineOp {
return jsonMany(
"Physical Activity/time_in_heart_rate_zones-*.json",
"Fitbit - Time in Heart Rate Zones",
["dateTime", "BELOW_DEFAULT_ZONE_1", "IN_DEFAULT_ZONE_1", "IN_DEFAULT_ZONE_2", "IN_DEFAULT_ZONE_3"],
`.[] | [.dateTime, (.value.valuesInZones.BELOW_DEFAULT_ZONE_1 // ""), (.value.valuesInZones.IN_DEFAULT_ZONE_1 // ""), (.value.valuesInZones.IN_DEFAULT_ZONE_2 // ""), (.value.valuesInZones.IN_DEFAULT_ZONE_3 // "")] | @csv`
);
}
function fitbit_very_active_minutes(): PipelineOp {
return jsonMany(
"Physical Activity/very_active_minutes-*.json",
"Fitbit - Very Active Minutes",
["dateTime", "value"],
`.[] | [.dateTime, .value] | @csv`
);
}
// ============================================================
// Sleep
// ============================================================
function fitbit_sleep(): PipelineOp {
return jsonMany(
"Sleep/sleep-*.json",
"Fitbit - Sleep",
["logId", "dateOfSleep", "startTime", "endTime", "duration", "minutesToFallAsleep", "minutesAsleep", "minutesAwake", "minutesAfterWakeup", "timeInBed", "efficiency", "type", "infoCode", "logType", "mainSleep", "deepMinutes", "wakeMinutes", "lightMinutes", "remMinutes"],
`.[] | [.logId, .dateOfSleep, .startTime, .endTime, .duration, .minutesToFallAsleep, .minutesAsleep, .minutesAwake, .minutesAfterWakeup, .timeInBed, .efficiency, .type, .infoCode, .logType, (.mainSleep | tostring), (.levels.summary.deep.minutes // ""), (.levels.summary.wake.minutes // ""), (.levels.summary.light.minutes // ""), (.levels.summary.rem.minutes // "")] | @csv`
);
}
function fitbit_sleep_score(): PipelineOp {
return csvOne("Sleep/sleep_score.csv", "Fitbit - Sleep Score");
}
function fitbit_daily_spo2(): PipelineOp {
return csvMany(
"Sleep/Daily SpO2 - *.csv",
"Fitbit - Daily SpO2",
["timestamp", "average_value", "lower_bound", "upper_bound"]
);
}
function fitbit_minute_spo2(): PipelineOp {
return csvMany(
"Sleep/Minute SpO2 - *.csv",
"Fitbit - Minute SpO2",
["timestamp", "value"]
);
}
function fitbit_device_temperature(): PipelineOp {
return csvMany(
"Sleep/Device Temperature - *.csv",
"Fitbit - Device Temperature",
["recorded_time", "temperature", "sensor_type"]
);
}
// ============================================================
// Social
// ============================================================
function fitbit_badges(): PipelineOp {
return pipe(
cd("Social/badge.json"), read(),
cmd(["jq", "-r", `
["encodedId", "badgeType", "value", "timesAchieved", "dateTime", "name", "shortName", "category"],
(.[] | [.encodedId, .badgeType, .value, .timesAchieved, .dateTime, .name, .shortName, .category])
| @csv
`]),
assignMeta({
idValue: "Fitbit - Badges",
columnMeta: ["any", "text", "numeric", "numeric", "isodatetime", "text", "text", "text"],
perRowTags: "fitbit",
})
);
}
// ============================================================
// Stress
// ============================================================
function fitbit_stress_score(): PipelineOp {
return csvOne("Stress/Stress Score.csv", "Fitbit - Stress Score");
}
// ============================================================
// Google Data / Health Fitness Data
// ============================================================
function fitbit_google_calibration(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/CalibrationStatusForReadinessAndLoad.csv", "Fitbit - Google Calibration Status");
}
function fitbit_google_goal_settings(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/GoalSettingsHistory.csv", "Fitbit - Google Goal Settings History");
}
function fitbit_google_irn_state(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/TakeoutIrnUserState.csv", "Fitbit - Google IRN User State");
}
function fitbit_google_app_settings(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/UserAppSettingData.csv", "Fitbit - Google App Setting Data");
}
function fitbit_google_demographic(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/UserDemographicData.csv", "Fitbit - Google Demographic Data");
}
function fitbit_google_legacy_settings(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/UserLegacySettingData.csv", "Fitbit - Google Legacy Setting Data");
}
function fitbit_google_mbd(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/UserMBDData.csv", "Fitbit - Google MBD Data");
}
function fitbit_google_profile(): PipelineOp {
return csvOne("Google Data/Health Fitness Data/UserProfileData.csv", "Fitbit - Google Profile Data");
}
function fitbit_google_exercises(): PipelineOp {
// Extension-less date-suffixed files (CSVs without .csv extension)
return csvMany(
"Google Data/Health Fitness Data/UserExercises_*",
"Fitbit - Google Exercises",
["exercise_id", "exercise_start", "exercise_end", "utc_offset", "exercise_created", "exercise_last_updated", "activity_name", "log_type", "pool_length", "pool_length_unit", "intervals", "distance_units", "tracker_total_calories", "tracker_total_steps", "tracker_total_distance_mm", "tracker_total_altitude_mm", "tracker_avg_heart_rate", "tracker_peak_heart_rate", "tracker_avg_pace_mm_per_second", "tracker_avg_speed_mm_per_second", "tracker_peak_speed_mm_per_second", "tracker_auto_stride_run_mm", "tracker_auto_stride_walk_mm", "tracker_swim_lengths", "tracker_pool_length", "tracker_pool_length_unit", "tracker_cardio_load", "manually_logged_total_calories", "manually_logged_total_steps", "manually_logged_total_distance_mm", "manually_logged_pool_length", "manually_logged_pool_length_unit", "events", "activity_type_probabilities", "autodetected_confirmed", "autodetected_start_timestamp", "autodetected_end_timestamp", "autodetected_utc_offset", "autodetected_activity_name", "autodetected_sensor_based_activity_name", "deletion_reason", "activity_label", "suggested_start_timestamp", "suggested_end_timestamp", "reconciliation_status"]
);
}
function fitbit_google_sleep_scores(): PipelineOp {
return csvMany(
"Google Data/Health Fitness Data/UserSleepScores_*",
"Fitbit - Google Sleep Scores",
["sleep_id", "sleep_score_id", "data_source", "score_utc_offset", "score_time", "overall_score", "duration_score", "composition_score", "revitalization_score", "sleep_time_minutes", "deep_sleep_minutes", "rem_sleep_percent", "resting_heart_rate", "sleep_goal_minutes", "waso_count_long_wakes", "waso_count_all_wake_time", "restlessness_normalized", "hr_below_resting_hr", "sleep_score_created", "sleep_score_last_updated"]
);
}
function fitbit_google_sleep_stages(): PipelineOp {
return csvMany(
"Google Data/Health Fitness Data/UserSleepStages_*",
"Fitbit - Google Sleep Stages",
["sleep_id", "sleep_stage_id", "sleep_stage_type", "start_utc_offset", "sleep_stage_start", "end_utc_offset", "sleep_stage_end", "data_source", "sleep_stage_created", "sleep_stage_last_updated"]
);
}
function fitbit_google_sleeps(): PipelineOp {
return csvMany(
"Google Data/Health Fitness Data/UserSleeps_*",
"Fitbit - Google Sleeps",
["sleep_id", "sleep_type", "minutes_in_sleep_period", "minutes_after_wake_up", "minutes_to_fall_asleep", "minutes_asleep", "minutes_awake", "minutes_longest_awakening", "minutes_to_persistent_sleep", "start_utc_offset", "sleep_start", "end_utc_offset", "sleep_end", "data_source", "sleep_created", "sleep_last_updated"]
);
}
// ============================================================
// Google Data / Physical Activity
// ============================================================
function fitbit_google_active_minutes(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/active_minutes_[0-9]*.csv",
"Fitbit - Google Active Minutes",
["timestamp", "light", "moderate", "very", "data source"]
);
}
function fitbit_google_active_zone_minutes(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/active_zone_minutes_[0-9]*.csv",
"Fitbit - Google Active Zone Minutes",
["timestamp", "heart rate zone", "total minutes", "data source"]
);
}
function fitbit_google_activity_level(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/activity_level_[0-9]*.csv",
"Fitbit - Google Activity Level",
["timestamp", "level", "data source"]
);
}
function fitbit_google_body_temperature(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/body_temperature_[0-9]*.csv",
"Fitbit - Google Body Temperature",
["timestamp", "temperature celsius", "data source"]
);
}
function fitbit_google_calories(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/calories_[0-9]*.csv",
"Fitbit - Google Calories",
["timestamp", "calories", "data source"]
);
}
function fitbit_google_calories_in_hr_zone(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/calories_in_heart_rate_zone_[0-9]*.csv",
"Fitbit - Google Calories in HR Zone",
["timestamp", "heart rate zone type", "kcal", "data source"]
);
}
function fitbit_google_cardio_load(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/cardio_load_[0-9]*.csv",
"Fitbit - Google Cardio Load",
["timestamp", "workout", "background", "total", "data source"]
);
}
function fitbit_google_daily_oxygen_saturation(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/daily_oxygen_saturation_[0-9]*.csv",
"Fitbit - Google Daily Oxygen Saturation",
["timestamp", "average percentage", "lower bound percentage", "upper bound percentage", "baseline percentage", "standard deviation percentage", "data source"]
);
}
function fitbit_google_distance(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/distance_[0-9]*.csv",
"Fitbit - Google Distance",
["timestamp", "distance", "data source"]
);
}
function fitbit_google_heart_rate(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/heart_rate_[0-9]*.csv",
"Fitbit - Google Heart Rate",
["timestamp", "beats per minute", "data source"]
);
}
function fitbit_google_heart_rate_variability(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/heart_rate_variability_[0-9]*.csv",
"Fitbit - Google Heart Rate Variability",
["timestamp", "root mean square of successive differences milliseconds", "standard deviation milliseconds", "data source"]
);
}
function fitbit_google_live_pace(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/live_pace_[0-9]*.csv",
"Fitbit - Google Live Pace",
["timestamp", "steps", "distance millimeters", "altitude gain millimeters", "data source"]
);
}
function fitbit_google_oxygen_saturation(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/oxygen_saturation_[0-9]*.csv",
"Fitbit - Google Oxygen Saturation",
["timestamp", "oxygen saturation percentage", "data source"]
);
}
function fitbit_google_respiratory_rate_sleep(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/respiratory_rate_sleep_summary_[0-9]*.csv",
"Fitbit - Google Respiratory Rate Sleep Summary",
["timestamp", "deep sleep stats - milli breaths per minute", "deep sleep stats - standard deviation milli breaths per minute", "deep sleep stats - signal to noise", "light sleep stats - milli breaths per minute", "light sleep stats - standard deviation milli breaths per minute", "light sleep stats - signal to noise", "rem sleep stats - milli breaths per minute", "rem sleep stats - standard deviation milli breaths per minute", "rem sleep stats - signal to noise", "full sleep stats - milli breaths per minute", "full sleep stats - standard deviation milli breaths per minute", "full sleep stats - signal to noise", "data source"]
);
}
function fitbit_google_sedentary_period(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/sedentary_period_[0-9]*.csv",
"Fitbit - Google Sedentary Period",
["start time", "end time", "data source"]
);
}
function fitbit_google_steps(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/steps_[0-9]*.csv",
"Fitbit - Google Steps",
["timestamp", "steps", "data source"]
);
}
function fitbit_google_swim_lengths(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/swim_lengths_data_[0-9]*.csv",
"Fitbit - Google Swim Lengths",
["timestamp", "lap time", "stroke count", "stroke type", "data source"]
);
}
function fitbit_google_time_in_hr_zone(): PipelineOp {
return csvMany(
"Google Data/Physical Activity/time_in_heart_rate_zone_[0-9]*.csv",
"Fitbit - Google Time in HR Zone",
["timestamp", "heart rate zone type", "data source"]
);
}
// Single (non-date-suffixed) Google Physical Activity files
function fitbit_google_cardio_acute_chronic(): PipelineOp {
return csvOne("Google Data/Physical Activity/cardio_acute_chronic_workload_ratio.csv", "Fitbit - Google Cardio Acute Chronic Workload Ratio");
}
function fitbit_google_cardio_load_observed(): PipelineOp {
return csvOne("Google Data/Physical Activity/cardio_load_observed_interval.csv", "Fitbit - Google Cardio Load Observed Interval");
}
function fitbit_google_daily_hrv(): PipelineOp {
return csvOne("Google Data/Physical Activity/daily_heart_rate_variability.csv", "Fitbit - Google Daily Heart Rate Variability");
}
function fitbit_google_daily_hr_zones(): PipelineOp {
return csvOne("Google Data/Physical Activity/daily_heart_rate_zones.csv", "Fitbit - Google Daily Heart Rate Zones");
}
function fitbit_google_daily_readiness(): PipelineOp {
return csvOne("Google Data/Physical Activity/daily_readiness.csv", "Fitbit - Google Daily Readiness");
}
function fitbit_google_daily_respiratory_rate(): PipelineOp {
return csvOne("Google Data/Physical Activity/daily_respiratory_rate.csv", "Fitbit - Google Daily Respiratory Rate");
}
function fitbit_google_daily_resting_hr(): PipelineOp {
return csvOne("Google Data/Physical Activity/daily_resting_heart_rate.csv", "Fitbit - Google Daily Resting Heart Rate");
}
function fitbit_google_demographic_vo2max(): PipelineOp {
return csvOne("Google Data/Physical Activity/demographic_vo2max.csv", "Fitbit - Google Demographic VO2 Max");
}
function fitbit_google_height(): PipelineOp {
return csvOne("Google Data/Physical Activity/height.csv", "Fitbit - Google Height");
}
function fitbit_google_weight(): PipelineOp {
return csvOne("Google Data/Physical Activity/weight.csv", "Fitbit - Google Weight");
}
// ============================================================
// Main export
// ============================================================
export function fitbit(): PipelineOp {
return pipe(
assignMeta({ idValue: t => `Fitbit - ${t.basename}` }),
branchGen(function* () {
// Application
yield fitbit_account_access_events();
yield fitbit_account_management_events();
yield fitbit_email_audit();
yield fitbit_retired_passwords();
// Heart
yield fitbit_afib_enrollment();
yield fitbit_hr_notification_alerts();
yield fitbit_hr_notification_profile();
// Menstrual Health
yield fitbit_menstrual_cycles();
yield fitbit_menstrual_symptoms();
yield fitbit_menstrual_birth_control();
yield fitbit_menstrual_settings();
// Other
yield fitbit_oxygen_variation();
// Personal & Account
yield fitbit_profile();
yield fitbit_devices();
yield fitbit_trackers();
yield fitbit_scales();
yield fitbit_tracker_config();
yield fitbit_ios_notifications();
yield fitbit_height();
yield fitbit_weight();
// Physical Activity
yield fitbit_active_zone_minutes();
yield fitbit_activity_goals();
yield fitbit_daily_readiness();
yield fitbit_calories();
yield fitbit_vo2_max();
yield fitbit_distance();
yield fitbit_exercises();
yield fitbit_heart_rate();
yield fitbit_lightly_active_minutes();
yield fitbit_moderately_active_minutes();
yield fitbit_resting_heart_rate();
yield fitbit_sedentary_minutes();
yield fitbit_steps();
yield fitbit_swim_lengths();
yield fitbit_time_in_hr_zones();
yield fitbit_very_active_minutes();
// Sleep
yield fitbit_sleep();
yield fitbit_sleep_score();
yield fitbit_daily_spo2();
yield fitbit_minute_spo2();
yield fitbit_device_temperature();
// Social
yield fitbit_badges();
// Stress
yield fitbit_stress_score();
// Google Data / Health Fitness Data
yield fitbit_google_calibration();
yield fitbit_google_goal_settings();
yield fitbit_google_irn_state();
yield fitbit_google_app_settings();
yield fitbit_google_demographic();
yield fitbit_google_legacy_settings();
yield fitbit_google_mbd();
yield fitbit_google_profile();
yield fitbit_google_exercises();
yield fitbit_google_sleep_scores();
yield fitbit_google_sleep_stages();
yield fitbit_google_sleeps();
// Google Data / Physical Activity (date-suffixed)
yield fitbit_google_active_minutes();
yield fitbit_google_active_zone_minutes();
yield fitbit_google_activity_level();
yield fitbit_google_body_temperature();
yield fitbit_google_calories();
yield fitbit_google_calories_in_hr_zone();
yield fitbit_google_cardio_load();
yield fitbit_google_daily_oxygen_saturation();
yield fitbit_google_distance();
yield fitbit_google_heart_rate();
yield fitbit_google_heart_rate_variability();
yield fitbit_google_live_pace();
yield fitbit_google_oxygen_saturation();
yield fitbit_google_respiratory_rate_sleep();
yield fitbit_google_sedentary_period();
yield fitbit_google_steps();
yield fitbit_google_swim_lengths();
yield fitbit_google_time_in_hr_zone();
// Google Data / Physical Activity (single files)
yield fitbit_google_cardio_acute_chronic();
yield fitbit_google_cardio_load_observed();
yield fitbit_google_daily_hrv();
yield fitbit_google_daily_hr_zones();
yield fitbit_google_daily_readiness();
yield fitbit_google_daily_respiratory_rate();
yield fitbit_google_daily_resting_hr();
yield fitbit_google_demographic_vo2max();
yield fitbit_google_height();
yield fitbit_google_weight();
})
);
}
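The `csvMany`/`csvOne` helpers these exporters call are defined outside this hunk. Based on the call sites above, they presumably compose the pipeline ops (`cd`/`glob`/`read`) and tag each target with an id and header row. A minimal self-contained sketch of that composition pattern — `Target`, `PipelineOp`, `pipe`, and `fakeGlob` here are simplified stand-ins for illustration, not the repo's real implementations:

```typescript
// Simplified stand-ins: a PipelineOp maps a list of targets to a new list.
type Target = { path: string; idValue?: string; headers?: string[] };
type PipelineOp = (targets: Target[]) => Target[];

// pipe() composes ops left to right, like the repo's pipe().
const pipe = (...ops: PipelineOp[]): PipelineOp =>
  (targets) => ops.reduce((acc, op) => op(acc), targets);

function fakeGlob(root: string, pattern: string): string[] {
  // Stand-in for filesystem globbing: pretend two date-suffixed files match.
  return [
    `${root}/${pattern.replace("[0-9]*", "2024-01-01")}`,
    `${root}/${pattern.replace("[0-9]*", "2024-01-02")}`,
  ];
}

// Hypothetical csvMany: expand a glob pattern, then tag id + headers.
function csvMany(pattern: string, id: string, headers: string[]): PipelineOp {
  return pipe(
    (ts) => ts.flatMap(t => fakeGlob(t.path, pattern).map(path => ({ path }))),
    (ts) => ts.map(t => ({ ...t, idValue: id, headers })),
  );
}

const out = csvMany(
  "Google Data/Physical Activity/steps_[0-9]*.csv",
  "Fitbit - Google Steps",
  ["timestamp", "steps", "data source"],
)([{ path: "/export" }]);
console.log(out.length);     // 2
console.log(out[0].idValue); // "Fitbit - Google Steps"
```

`csvOne` would be the same shape minus the glob expansion, taking a literal path instead of a pattern.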


@ -1,107 +1,115 @@
import { TaskTargetPipelineHelper } from "./task.ts"; import { pipe, branch, cmd, assignMeta, cd, glob, read, branchGen, type PipelineOp } from "./task.ts";
import { htmlSelectorChunkedDuplex } from "./html.ts"; import { htmlSelectorChunkedDuplex } from "./html.ts";
export function google(this: TaskTargetPipelineHelper){ export function google(){
const p = this.setId(t=>`Google - ${t.basename}`); // Generic ID for everything in here return pipe(
const col: Set<TaskTargetPipelineHelper> = new Set(); // Generic ID for everything in here
assignMeta({ idValue: t=>`Google - ${t.basename}` }),
// TODO: There is a root takeout folder branchGen(function*() {
// TODO: There is a root takeout folder
p.collect(col).cd('Access Log Activity/Activities - A list of Google services accessed by.csv').read() yield pipe(cd('Access Log Activity/Activities - A list of Google services accessed by.csv'), read())
p.collect(col).cd('Devices - A list of devices (i.e. Nest, Pixel, iPh.csv').read() yield pipe(cd('Devices - A list of devices (i.e. Nest, Pixel, iPh.csv'), read())
// Assignments - data was empty // Assignments - data was empty
// Business messages - GMB messages, there's some but so far outside of what I want // Business messages - GMB messages, there's some but so far outside of what I want
// TODO: Calendar, exports an .ics // TODO: Calendar, exports an .ics
// a = t.fork().cd(`Chrome`) // a = t.fork().cd(`Chrome`)
// TODO: Assersses and mode.json // TODO: Assersses and mode.json
// TODO: Bookmarks.csv // TODO: Bookmarks.csv
// TODO: Device Information.json // TODO: Device Information.json
// TODO: Dictionary.csv // TODO: Dictionary.csv
// TODO: ... // TODO: ...
p.collect(col).cd('Chrome/History.json') yield pipe(
.read() cd('Chrome/History.json'),
// TODO: Typed Url", no data read(),
// TODO: "session", complex data // TODO: Typed Url", no data
// Omitted .ptoken and .client_id for now. I think ptoken is maybe for the history API? client_id is base64 something... // TODO: "session", complex data
// TODO: time_usec IS WRONG!! Needs to be ms // Omitted .ptoken and .client_id for now. I think ptoken is maybe for the history API? client_id is base64 something...
.cmd(["jq", "-r", `["favicon_url","page_transition","title","url","time_usec"], // TODO: time_usec IS WRONG!! Needs to be ms
( cmd(["jq", "-r", `["favicon_url","page_transition","title","url","time_usec"],
."Browser History"[] (
| [.favicon_url, .page_transition, .title, .url, (.time_usec | todateiso8601)] ."Browser History"[]
) | [.favicon_url, .page_transition, .title, .url, (.time_usec | todateiso8601)]
| @csv`]) )
| @csv
`])
);
// TODO: Contacts, exports a .vcf // TODO: Contacts, exports a .vcf
// TODO: ... // TODO: ...
// a = t.fork().cd(`Google Pay`) // a = t.fork().cd(`Google Pay`)
p.collect(col).cd(`Google Pay/Google transactions`).glob(`transactions_*.csv`) yield pipe(
.read() cd(`Google Pay/Google transactions`),
.csvSink() glob(`transactions_*.csv`),
// .fork("a").cd(`Money sends and requests`) read(),
// .fork().cd(`Money sends and requests.csv`) // .fork("a").cd(`Money sends and requests`)
// .read() // .fork().cd(`Money sends and requests.csv`)
// .cmd(t=>["sqlite-utils", "insert", "your.db", t.basename, "-", "--csv", "--detect-types"]) // .read()
// TODO: One more folder, and it only has a pdf // .cmd(t=>["sqlite-utils", "insert", "your.db", t.basename, "-", "--csv", "--detect-types"])
// TODO: One more folder, and it only has a pdf
);
// TODO: Google Play Movies _ TV - no data // TODO: Google Play Movies _ TV - no data
// TODO: ... // TODO: ...
p.collect(col).cd("Location History/Location History.json") yield pipe(
.read() cd("Location History/Location History.json"),
// TODO: This is missing read(),
// "altitude" : 158, // TODO: This is missing
// "verticalAccuracy" : 68 // "altitude" : 158,
// and the activity models. I had no idea google tries to determine if I'm "tilting" // "verticalAccuracy" : 68
.cmd(["jq", "-r", `["timestamp","latitudeE7","longitudeE7","accuracy"], // and the activity models. I had no idea google tries to determine if I'm "tilting"
( cmd(["jq", "-r", `["timestamp","latitudeE7","longitudeE7","accuracy"],
.locations[] (
| [.timestampMs | todateiso8601, .latitudeE7, .longitudeE7, .accuracy] .locations[]
) | [.timestampMs | todateiso8601, .latitudeE7, .longitudeE7, .accuracy]
| @csv`]) )
.csvSink() | @csv
// There's also the semantic history but that's an entire nother can of worms `])
// it seems like );
// There's also the semantic history but that's an entire nother can of worms
// it seems like
// TODO: Needs no-headers! // TODO: Needs no-headers!
// a = t.fork().cd(`My Activity`) // a = t.fork().cd(`My Activity`)
// a.fork().glob(`**/MyActivity.html`) // a.fork().glob(`**/MyActivity.html`)
// .setId(t=>`Google - ${t.basenameN(2)}`) // .setId(t=>`Google - ${t.basenameN(2)}`)
// .read() // .read()
// .pipe(()=>{ // .pipe(()=>{
// // Parses the MyActivity format, chunking it into pieces of HTML text // // Parses the MyActivity format, chunking it into pieces of HTML text
// // and then parsing out the text // // and then parsing out the text
// const dup = htmlSelectorChunkedDuplex( // const dup = htmlSelectorChunkedDuplex(
// (tag, attrs)=>{ // (tag, attrs)=>{
// // TODO: We also probably want to get and parse each // // TODO: We also probably want to get and parse each
// // ".content-cell.mdl-typography--caption" as well (it // // ".content-cell.mdl-typography--caption" as well (it
// // has location for websearches and sometimes a details field) // // has location for websearches and sometimes a details field)
// // but then we have to get ".mdl-grid" and parse it // // but then we have to get ".mdl-grid" and parse it
// return attrs.class?.includes("content-cell") // return attrs.class?.includes("content-cell")
// && attrs.class?.includes("mdl-typography--body-1") // && attrs.class?.includes("mdl-typography--body-1")
// && !attrs.class?.includes("mdl-typography--text-right") // && !attrs.class?.includes("mdl-typography--text-right")
// }, // },
// (chunk)=>{ // (chunk)=>{
// const text = chunk.innerText; // const text = chunk.innerText;
// const split = text.split("\n"); // const split = text.split("\n");
// const timestamp = split.pop(); // TODO: need to parse this // const timestamp = split.pop(); // TODO: need to parse this
// const rest = split.join("\n"); // const rest = split.join("\n");
// // TODO: Escape instead of replace // // TODO: Escape instead of replace
// const restSafe = rest.replace(/"/g, "'").replace(/\n/g,"\\n"); // escape newlines and quotes // const restSafe = rest.replace(/"/g, "'").replace(/\n/g,"\\n"); // escape newlines and quotes
// // Return a CSV // // Return a CSV
// return `"${restSafe}","${timestamp}"\n`; // return `"${restSafe}","${timestamp}"\n`;
// } // }
// ); // );
// return dup; // return dup;
// }) // })
// TODO: News // TODO: News
// TODO: Profile // TODO: Profile
// TODO: Tasks - No data // TODO: Tasks - No data
})
return Array.from(col); );
}; };
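On the `time_usec IS WRONG!!` TODO in the Chrome History pipeline above: Chrome's `History.json` stores timestamps as microseconds since the Unix epoch, while jq's `todateiso8601` expects whole seconds, so the value needs dividing down first (in jq, likely something like `(.time_usec / 1000000 | floor | todateiso8601)` — unverified against this export). The equivalent conversion in TypeScript, for reference:

```typescript
// Chrome history `time_usec` is microseconds since the Unix epoch;
// Date wants milliseconds, so divide by 1000.
function chromeTimeToISO(time_usec: number): string {
  return new Date(Math.floor(time_usec / 1000)).toISOString();
}

console.log(chromeTimeToISO(1700000000000000)); // "2023-11-14T22:13:20.000Z"
```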

52
data-export/io.ts Normal file

@ -0,0 +1,52 @@
import fs from 'node:fs/promises';
import fsSync from 'node:fs';
import { DatabaseSync } from "node:sqlite";
import { type ProcessOutputAggregate, type RunOutput, TaskTarget, runAll, type ProcessOutputSimple } from "./task.ts";
import { ProcessOutput } from 'zx';
async function loadCSVTable(
db: DatabaseSync,
target: TaskTarget,
result: ProcessOutput | ProcessOutputAggregate | ProcessOutputSimple
) {
const id = target.id;
const table = id;
const tmpPath = `/tmp/${id}.csv`;
// console.log(`Writing ${tmpPath}`);
const fd = await fs.open(tmpPath, 'w');
await fs.writeFile(fd, result.stdout, { encoding: 'utf8' });
await fd.close();
// console.log(`Loading ${tmpPath} → table ${table}`);
db.exec(`CREATE VIRTUAL TABLE temp.intermediate USING csv(filename='${tmpPath}', header);`);
db.exec(`CREATE TABLE "${table}" AS SELECT * FROM intermediate;`);
db.exec(`DROP TABLE IF EXISTS intermediate;`);
return;
}
// TODO: This should really have the same name throughout the codebase?
export const runPipeline = runAll;
/**
* @param db Must be a DatabaseSync with the csv.so extension enabled
*/
export async function loadIntoDb(db: DatabaseSync, runOutput: RunOutput[]) {
for (const {result, target} of runOutput) {
await loadCSVTable(db, target, result);
}
}
export function getDefaultDB(): DatabaseSync {
const db = new DatabaseSync(":memory:", { allowExtension: true });
db.loadExtension("/home/cobertos/sqlite-files/csv.so");
db.enableLoadExtension(false);
return db;
}
export async function dumpDBToDisk(db: DatabaseSync, dumpPath: string) {
if (fsSync.existsSync(dumpPath)) {
await fs.unlink(dumpPath); // unlink the old
}
// Dump it all to the path specified
db.exec(`VACUUM main INTO '${dumpPath}'`);
}
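`loadCSVTable` above interpolates `target.id` and `tmpPath` directly into SQL. The ids appear to pass through `safe()` in task.ts, but a defensive identifier-quoting helper is cheap insurance for the `CREATE TABLE "${table}"` statement. A minimal sketch — `quoteIdent` is hypothetical, not part of this diff:

```typescript
// Escape a SQL identifier by doubling embedded double quotes, so that
// `CREATE TABLE ${quoteIdent(table)} ...` stays well-formed for any id.
function quoteIdent(name: string): string {
  return `"${name.replace(/"/g, '""')}"`;
}

console.log(quoteIdent("Fitbit - Google Steps")); // "Fitbit - Google Steps" (quoted)
console.log(quoteIdent('weird"name'));            // "weird""name"
```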


@@ -1,17 +1,18 @@
-import { $, type ProcessOutput } from 'zx';
 import os from 'os';
-import { type TaskTarget, run } from "./task.ts";
-$.verbose = false;
-type ResultMap = Map<string, ProcessOutput>;
-export async function parallel(
-  targets: TaskTarget[],
+/**Generic parallel runner with optional logging
+ * Runs `targets` with `runFn` up to a maximum of `maxConcurrency` amount at a time
+ * Shaped in a way that expects generally something that returns zx.ProcessOutput (or
+ * something with .duration and .ok built-in to the return)
+ * @param runFn Should NOT throw. Return { ok: false } instead
+ */
+export async function parallel<T, R extends { duration: number, ok: boolean }>(
+  targets: T[],
+  runFn: (t: T)=>Promise<R>,
   quiet: boolean = false,
   maxConcurrency: number = os.cpus().length
-): Promise<ResultMap> {
-  const results = new Map<string, ProcessOutput>();
+): Promise<R[]> {
+  const resultMap = new Map<T, R>();
   const total = targets.length;
   let completed = 0;
@@ -42,14 +43,14 @@ export async function parallel(
     process.stderr.write(`\r${formatEta()}`.padEnd(80));
   }
-  async function runJob(t: TaskTarget): Promise<void> {
+  async function runJob(t: T): Promise<void> {
     running++;
     printStatus();
-    const result = await run(t);
+    const result = await runFn(t);
     completionTimes.push(result.duration);
-    results.set(t.id, result);
+    resultMap.set(t, result);
     running--;
     completed++;
@@ -77,10 +78,18 @@ export async function parallel(
   // Final status line
   process.stderr.write('\n');
   const totalSeconds = ((Date.now() - startTime) / 1000).toFixed(1);
-  const failed = Array.from(results.values().filter(p => !p.ok));
-  process.stderr.write(
-    `\nCompleted ${total} jobs in ${totalSeconds}s (${failed.length} failed)\n`
-  );
+  const failed = Array.from(resultMap.values().filter(p => !p.ok));
+  if (!quiet) {
+    process.stderr.write(
+      `\nCompleted ${total} jobs in ${totalSeconds}s (${failed.length} failed)\n`
+    );
+  }
-  return results;
+  const output = targets
+    .map(t => {
+      const r = resultMap.get(t)!;
+      return r;
+    });
+  return output;
 }

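The rewritten `parallel()` is generic over `T` and `R extends { duration, ok }`, and returns results in the order of `targets` rather than completion order. A self-contained miniature demonstrating the same contract — this re-implements the concurrency limit inline rather than importing the real module, and drops the ETA logging:

```typescript
type RunResult = { duration: number; ok: boolean };

// Minimal concurrency-limited runner with the same shape as parallel():
// results come back in `targets` order, not completion order.
async function miniParallel<T, R extends RunResult>(
  targets: T[],
  runFn: (t: T) => Promise<R>,
  maxConcurrency = 4,
): Promise<R[]> {
  const results = new Array<R>(targets.length);
  let next = 0;
  async function worker() {
    // `next < length` check and `next++` run synchronously together,
    // so single-threaded JS can't hand the same index to two workers.
    while (next < targets.length) {
      const i = next++;
      results[i] = await runFn(targets[i]);
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(maxConcurrency, targets.length) }, worker),
  );
  return results;
}

const out = await miniParallel([3, 1, 2], async (n) => {
  await new Promise(r => setTimeout(r, n)); // finishes out of order
  return { duration: n, ok: true, value: n * 10 };
});
console.log(out.map(r => r.value)); // [30, 10, 20] — input order preserved
```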
260
data-export/snapchat.ts Normal file

@ -0,0 +1,260 @@
import { pipe, cmd, assignMeta, cd, read, branchGen, type PipelineOp } from "./task.ts";
/**
* jq helper to normalize Snapchat's "YYYY-MM-DD HH:MM:SS UTC" timestamps
* to ISO 8601 "YYYY-MM-DDTHH:MM:SS+00:00". Passes through empty strings/nulls.
*/
const SNAPISO = `def snapiso: if . == null or . == "" then "" elif endswith(" UTC") then ((.[:-4] | gsub(" "; "T")) + "+00:00") else . end;`;
/** Login events from account.json */
function snapchat_login_history(): PipelineOp {
return pipe(
cd(`json/account.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["ip", "country", "created", "status", "device"],
(
.["Login History"][]
| [.IP, .Country, (.Created | snapiso), .Status, .Device]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Login History",
columnMeta: ["text", "text", "isodatetime", "text", "text"],
perRowDescription: 'Login from {0} ({1}) on {2}',
perRowTags: "snapchat,security",
})
);
}
/**
* Account changes over time from account_history.json.
* Flattens display name changes, email changes, password changes,
* bitmoji links, and data download requests into one table.
*/
function snapchat_account_history(): PipelineOp {
return pipe(
cd(`json/account_history.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["change_type", "date", "detail"],
(
(
(.["Display Name Change"][]? | {t: "display_name_change", d: .Date, v: ."Display Name"}),
(.["Email Change"][]? | {t: "email_change", d: .Date, v: ."Email Address"}),
(.["Password Change"][]? | {t: "password_change", d: .Date, v: ""}),
(.["Snapchat Linked to Bitmoji"][]? | {t: "linked_to_bitmoji", d: .Date, v: ""}),
(.["Download My Data Reports"][]? | {t: "data_download", d: .Date, v: (.Status + " / " + ."Email Address")})
)
| [.t, (.d | snapiso), .v]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Account History",
columnMeta: ["text", "isodatetime", "text"],
perRowDescription: '{0} on {1}: {2}',
perRowTags: "snapchat,security",
})
);
}
/**
* All friend relationship types from friends.json combined into one table.
* relationship_type column distinguishes Friends, Blocked Users, etc.
*/
function snapchat_friends(): PipelineOp {
return pipe(
cd(`json/friends.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["relationship_type", "username", "display_name", "created_at", "modified_at", "source"],
(
["Friends", "Friend Requests Sent", "Blocked Users", "Deleted Friends", "Ignored Snapchatters", "Pending Requests"][]
as $key
| .[$key][]?
| [$key, .Username, ."Display Name", (."Creation Timestamp" | snapiso), (."Last Modified Timestamp" | snapiso), .Source]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Friends",
columnMeta: ["text", "text", "text", "isodatetime", "isodatetime", "text"],
perRowDescription: '{0}: {2} (@{1}) since {3}',
perRowTags: "snapchat",
})
);
}
/**
* All chat messages from chat_history.json.
* Keys in that file are friend usernames; they become the conversation_with column.
*/
function snapchat_chat_history(): PipelineOp {
return pipe(
cd(`json/chat_history.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["conversation_with", "from", "media_type", "created", "content", "is_sender"],
(
to_entries[]
| .key as $conv
| .value[]
| [$conv, .From, ."Media Type", (.Created | snapiso), (.Content // ""), (.IsSender | tostring)]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Chat History",
columnMeta: ["text", "sender", "text", "isodatetime", "text", "any"],
perRowDescription: '"{4}" from {1} in {0} at {3}',
perRowTags: "snapchat,message",
})
);
}
/** Approximate visited areas from location_history.json */
function snapchat_location_visits(): PipelineOp {
return pipe(
cd(`json/location_history.json`), read(),
cmd(["jq", "-r", `
["time", "city", "region", "postal_code"],
(
.["Areas you may have visited in the last two years"][]
| [.Time, .City, .Region, ."Postal Code"]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Location Visits",
columnMeta: ["any", "text", "text", "any"],
perRowDescription: 'Visited {1}, {2} ({3}) around {0}',
perRowTags: "snapchat,location",
})
);
}
/** Spotlight/story activity from shared_story.json */
function snapchat_spotlight(): PipelineOp {
return pipe(
cd(`json/shared_story.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["story_date", "story_url", "action_type", "view_time"],
(
.["Spotlight History"][]
| [(.["Story Date"] | snapiso), .["Story URL"], .["Action Type"], .["View Time"]]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Spotlight",
columnMeta: ["isodatetime", "url", "text", "any"],
perRowDescription: '{2} on spotlight at {0}',
perRowTags: "snapchat",
})
);
}
/** Terms of service acceptance history from terms_history.json */
function snapchat_terms_history(): PipelineOp {
return pipe(
cd(`json/terms_history.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["version", "acceptance_date"],
(
.["Snap Inc. Terms of Service"][]
| [.Version, (.["Acceptance Date"] | snapiso)]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Terms History",
columnMeta: ["text", "isodatetime"],
perRowDescription: 'Accepted terms {0} on {1}',
perRowTags: "snapchat",
})
);
}
/** Third-party app permissions from connected_apps.json */
function snapchat_connected_apps(): PipelineOp {
return pipe(
cd(`json/connected_apps.json`), read(),
cmd(["jq", "-r", `${SNAPISO}
["app", "time", "type"],
(
.Permissions[]
| [.App, (.Time | snapiso), .Type]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Connected App Permissions",
columnMeta: ["text", "isodatetime", "text"],
perRowDescription: '{2} permission for {0} on {1}',
perRowTags: "snapchat",
})
);
}
/** Email campaign subscription preferences from email_campaign_history.json */
function snapchat_email_campaigns(): PipelineOp {
return pipe(
cd(`json/email_campaign_history.json`), read(),
cmd(["jq", "-r", `
["campaign", "opt_out_status"],
(
.["Email Campaign Subscriptions"][]
| [.["Email Campaign"], .["Opt Out Status"]]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - Email Campaigns",
columnMeta: ["text", "text"],
perRowDescription: 'Email campaign "{0}": {1}',
perRowTags: "snapchat",
})
);
}
/**
* In-app survey responses from in_app_surveys.json.
* Keys are dynamic dates ("Survey YYYY/MM/DD"), flattened via to_entries.
*/
function snapchat_surveys(): PipelineOp {
return pipe(
cd(`json/in_app_surveys.json`), read(),
cmd(["jq", "-r", `
["survey", "time", "question", "response"],
(
to_entries[]
| .key as $survey
| .value[]
| [$survey, .Time, .["Survey Question"], .["Survey Response"]]
)
| @csv
`]),
assignMeta({
idValue: "Snapchat - In-App Surveys",
columnMeta: ["text", "any", "text", "text"],
perRowDescription: 'Survey "{2}": {3}',
perRowTags: "snapchat",
})
);
}
export function snapchat(): PipelineOp {
return pipe(
assignMeta({ idValue: t => `Snapchat - ${t.basename}` }),
branchGen(function* () {
yield snapchat_login_history();
yield snapchat_account_history();
yield snapchat_friends();
yield snapchat_chat_history();
yield snapchat_location_visits();
yield snapchat_spotlight();
yield snapchat_terms_history();
yield snapchat_connected_apps();
yield snapchat_email_campaigns();
yield snapchat_surveys();
})
);
}
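The `SNAPISO` jq helper at the top of this file can be sanity-checked against a TypeScript port. This port is purely illustrative — the pipeline itself runs the jq version:

```typescript
// Port of the `snapiso` jq filter:
// "YYYY-MM-DD HH:MM:SS UTC" → "YYYY-MM-DDTHH:MM:SS+00:00".
function snapiso(s: string | null | undefined): string {
  if (s == null || s === "") return "";
  if (s.endsWith(" UTC")) {
    // Drop the trailing " UTC", swap the date/time separator, pin to +00:00
    // (mirrors jq's `.[:-4] | gsub(" "; "T")`).
    return s.slice(0, -4).replace(/ /g, "T") + "+00:00";
  }
  return s; // not the expected format: pass through, like the jq version
}

console.log(snapiso("2022-05-01 14:30:00 UTC")); // "2022-05-01T14:30:00+00:00"
console.log(snapiso(""));                        // ""
```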


@ -2,8 +2,10 @@ import nodePath from 'node:path';
import fs from 'node:fs'; import fs from 'node:fs';
import { strict as assert } from "node:assert"; import { strict as assert } from "node:assert";
import { ZipFS } from "./zipFs.ts"; import { ZipFS } from "./zipFs.ts";
import { globSync } from "glob"; import { $, ProcessOutput, quote } from "zx";
import { $, ProcessPromise, quote } from "zx"; import { parallel } from "./parallel.ts";
$.verbose = false;
type FSImpl = { type FSImpl = {
isZip?: boolean; isZip?: boolean;
@ -11,6 +13,7 @@ type FSImpl = {
init?(): Promise<void>; init?(): Promise<void>;
ready?: boolean; ready?: boolean;
globSync: typeof fs["globSync"];
statSync: typeof fs["statSync"]; statSync: typeof fs["statSync"];
existsSync: typeof fs["existsSync"]; existsSync: typeof fs["existsSync"];
@ -38,19 +41,20 @@ function safe(s: string) {
interface TaskTargetOp { interface TaskTargetOp {
type: "read" | "mid"; type: "read" | "mid";
toShell(target: TaskTarget): string; toShell(target: TaskTarget): string | undefined;
clone(): TaskTargetOp; clone(): TaskTargetOp;
} }
class TaskTargetRead implements TaskTargetOp { class TaskTargetRead implements TaskTargetOp {
get type(){ return "read" as const; } get type(){ return "read" as const; }
toShell(target: TaskTarget) { toShell(target: TaskTarget) {
if (target.fsImpl.isZip) { if (target.fsImpl.isZip) {
// Read the file to stdout from the target inside the zip file
// This relies on the internals of fsImpl a bit to have the path to
// the root zip so we can create a command against it
assert(target.fsImpl.zipPath, "Should have a zipPath"); assert(target.fsImpl.zipPath, "Should have a zipPath");
// We need to be able to do this
return `7z x ${quote(target.fsImpl.zipPath)} -so ${quote(target.path)}`; return `7z x ${quote(target.fsImpl.zipPath)} -so ${quote(target.path)}`;
} }
// TODO : Implement when reading from a zip file
return `cat ${quote(target.path)}`; return `cat ${quote(target.path)}`;
} }
clone() { clone() {
@ -93,17 +97,67 @@ class TaskTargetCmd implements TaskTargetOp {
} }
type ValidId = string | ((t: TaskTarget)=>string); type ValidId = string | ((t: TaskTarget)=>string);
export const COLUMN_TYPES = {
/**A numeric value*/
"numeric": {},
/**ISO Datetime*/
"isodatetime": {},
/**Urls*/
"url": {},
/**Freetext*/
"text": {},
/**For anything untyped*/
"any": {},
/**The sender/originator of a row (maps to Owner in Timelinize)*/
"sender": {},
/**The receiver/recipient of a row (maps to RelSent entity in Timelinize)*/
"receiver": {},
/**Latitude coordinate*/
"lat": {},
/**Longitude coordinate*/
"lng": {},
"TODO": {}
};
/**Column metadata. Just a string into the TYPES*/
type ColumnMeta = (keyof typeof COLUMN_TYPES | undefined);
// Make non-optional version of just the metadata values of TaskTarget
type TaskTargetMeta = Required<Pick<TaskTarget, "idValue" | "perRowDescription" | "perRowTags" | "columnMeta" | "aggregate" | "metaIdValue" | "aggregateColumns">>;
export class TaskTarget { export class TaskTarget {
/**The current path pointed to by this TaskTarget*/
path: string; path: string;
/**The fsImpl used to access the .path*/
fsImpl: FSImpl = defaultFSImpl; fsImpl: FSImpl = defaultFSImpl;
/**The pipeline of things to do to the above path to get an stdout of the output*/
pipeline: TaskTargetOp[]; pipeline: TaskTargetOp[];
idValue: ValidId | undefined;
postFns: ((t: TaskTarget)=>Promise<void>)[]; // == Metadata, user configurable, no good defaults ==
/**Id of the TaskTarget
* string - Static id
* fn returning string - Function can derive the id from a task target even after a glob() and cd() operation
**/
idValue?: ValidId;
/**For every output CSV, this defines a description of that CSV per-row
* Use the items {0}, {1} to template
* Example: For a CSV with a row format like ["time", "sender", "sendee", "message"]
* you might do something like '"{3}" sent from {2} to {1}'
* */
perRowDescription?: string;
/**A CSV of tags that is added to every row of the table (TODO: no template functionality currently)*/
perRowTags?: string;
/**Metadata about the columns*/
columnMeta?: ColumnMeta[];
/**Whether or not to aggregate to a single task (everything with the id value idValue)*/
aggregate?: boolean;
/**Names of the columns to aggregate with*/
aggregateColumns?: string[];
/**A metadata TaskTarget for this TaskTarget, if one exists*/
metaIdValue?: ValidId;
constructor(path: string){ constructor(path: string){
this.path = path; this.path = path;
this.pipeline = []; this.pipeline = [];
this.postFns = [];
} }
exists() { exists() {
@ -136,6 +190,15 @@ export class TaskTarget {
} }
return safe(this.idValue); return safe(this.idValue);
} }
get metaId() {
if (!this.metaIdValue) {
return undefined;
}
if (typeof this.metaIdValue === "function") {
return safe(this.metaIdValue(this));
}
return safe(this.metaIdValue);
}
/**Changes the current directory of the target*/ /**Changes the current directory of the target*/
cd(path: string): TaskTarget { cd(path: string): TaskTarget {
@ -154,10 +217,7 @@ export class TaskTarget {
/**Get a glob off of the target*/ /**Get a glob off of the target*/
glob(globPath: string): TaskTarget[] { glob(globPath: string): TaskTarget[] {
globPath = this._joinPath(globPath); globPath = this._joinPath(globPath);
const items = globSync(globPath, { const items = this.fsImpl.globSync(globPath);
cwd: '/DUMMYCWD',
fs: this.fsImpl
});
const ret = items.map(i => new TaskTarget(i)); const ret = items.map(i => new TaskTarget(i));
// TODO: This should probably clone() // TODO: This should probably clone()
ret.forEach(t => t.fsImpl = this.fsImpl); // Should all use the same fsImpl ret.forEach(t => t.fsImpl = this.fsImpl); // Should all use the same fsImpl
@ -168,10 +228,16 @@ export class TaskTarget {
clone(): TaskTarget { clone(): TaskTarget {
const t = new TaskTarget(this.path); const t = new TaskTarget(this.path);
t.fsImpl = this.fsImpl; // holds no state, just needs same impl t.fsImpl = this.fsImpl; // holds no state, just needs same impl
t.idValue = this.idValue;
t.postFns = this.postFns.slice();
t.pipeline = this.pipeline.slice() t.pipeline = this.pipeline.slice()
.map(p => p.clone()); .map(p => p.clone());
// metadata
t.idValue = this.idValue;
t.perRowDescription = this.perRowDescription;
t.perRowTags = this.perRowTags;
t.columnMeta = this.columnMeta?.slice();
t.metaIdValue = this.metaIdValue;
t.aggregate = this.aggregate;
t.aggregateColumns = this.aggregateColumns?.slice();
return t; return t;
} }
@ -186,14 +252,11 @@ export class TaskTarget {
toShell() { toShell() {
const shell = this.pipeline const shell = this.pipeline
.map(p => p.toShell(this)) .map(p => p.toShell(this))
.filter(p => !!p) // remove empty strings and undefined
.join(" | ") .join(" | ")
return shell; return shell;
} }
pushPostFn(fn: ((t: TaskTarget)=>Promise<void>)) {
this.postFns.push(fn);
}
cmd(cmd: ValidCmd) { cmd(cmd: ValidCmd) {
this.pushToPipeline(new TaskTargetCmd(cmd)); this.pushToPipeline(new TaskTargetCmd(cmd));
return this; return this;
@@ -202,92 +265,82 @@ export class TaskTarget {
     this.pushToPipeline(new TaskTargetRead());
     return this;
   }
-  setId(idValue: ValidId) {
-    this.idValue = idValue;
+  assignMeta(meta: Partial<TaskTargetMeta>) {
+    Object.assign(this, {
+      ...meta,
+      // Clone this deeply so no shared object references
+      columnMeta: meta.columnMeta?.slice()
+    });
     return this;
   }
-  post(fn: any) {
-    this.pushPostFn(fn);
-  }
-  types(
-    types: string[]
-  ) {
-    // TODO:
-    return this;
-  }
-  csvSink(
-    summarization?: [string, string][]
-  ) {
-    // TODO:
-    return this;
-    // Ingest this csv into the database at the given id
-    // this.cmd(t=>["sqlite-utils", "insert", "your.db", t.id, "-", "--csv", "--detect-types"]);
-    // Add a post processing function for these targets that prints out the summarization
-    // stats
-    // this.post(async (t: TaskTarget)=>{
-    //   // We only do the first one so far for the summarization
-    //   let queryLine: string;
-    //   let formatFn: (r: any)=>string;
-    //   const [columnName, type] = summarization?.[0] ?? [undefined, undefined];
-    //   if (type === "numeric") {
-    //     queryLine = `min(${columnName}) as lo, max(${columnName}) as hi, count(*) as n`;
-    //     formatFn = (r: any)=>`${r.n} rows from ${r.lo} to ${r.hi} for ${t.id}`;
-    //   }
-    //   else {
-    //     queryLine = `count(*) as n`;
-    //     formatFn = (r: any)=>`${r.n} rows for ${t.id}`;
-    //   }
-    //   const cmd = "sqlite-utils";
-    //   const args = ["query", "your.db", `select ${queryLine} from ${t.id}`]
-    //   const { stdout, stderr } = await execFile(cmd, args);
-    //   const results = JSON.parse(stdout);
-    //   const result = results[0]; // should only be one result in the array for this type of query
-    //   const logLine = formatFn(result);
-    //   (t as any).log = logLine;
-    // });
-    // return this;
-  }
 }
-export function each(targets: TaskTarget[], fn: (t: TaskTarget)=>void) {
-  for (const t of targets) {
-    fn(t);
-  }
-}
-export function map(targets: TaskTarget[], fn: (t: TaskTarget)=>TaskTarget) {
-  const newTargets = [];
-  for (const t of targets) {
-    newTargets.push(fn(t));
-  }
-  return newTargets;
-}
-export function cd(targets: TaskTarget[], path: string): TaskTarget[] {
-  return targets.map(t => t.clone().cd(path));
-}
-export function glob(targets: TaskTarget[], globPath: string): TaskTarget[] {
-  return targets.map(t => t.glob(globPath)).flat();
-}
-export async function unzip(targets: TaskTarget[]): Promise<TaskTarget[]> {
-  return Promise.all(targets.map(t => t.unzip()));
-}
-export function read(targets: TaskTarget[]): TaskTarget[] {
-  return targets.map(t => t.clone().read())
-}
-export function cmd(targets: TaskTarget[], cmd: ValidCmd): TaskTarget[] {
-  return targets.map(t => t.clone().cmd(cmd))
-}
-export function setId(targets: TaskTarget[], id: ValidId): TaskTarget[] {
-  return targets.map(t => t.clone().setId(id))
-}
+export interface PipelineOp {
+  (targets: TaskTarget[]): TaskTarget[] | Promise<TaskTarget[]>;
+}
+
+export function cd(path: string): PipelineOp {
+  return (targets: TaskTarget[]) => targets.map(t => t.clone().cd(path));
+}
+export function glob(globPath: string): PipelineOp {
+  return (targets: TaskTarget[]) => targets.map(t => t.glob(globPath)).flat();
+}
+export function unzip(): PipelineOp {
+  return async (targets: TaskTarget[]) => Promise.all(targets.map(t => t.unzip()));
+}
+export function read(): PipelineOp {
+  return (targets: TaskTarget[]) => targets.map(t => t.clone().read())
+}
+export function cmd(cmd: ValidCmd): PipelineOp {
+  return (targets: TaskTarget[]) => targets.map(t => t.clone().cmd(cmd))
+}
+export function assignMeta(meta: Partial<TaskTargetMeta>): PipelineOp {
+  return (targets: TaskTarget[]) => targets.map(t => t.clone().assignMeta(meta))
+}
+
+export function each(fn: (t: TaskTarget)=>TaskTarget): PipelineOp {
+  return (targets: TaskTarget[])=> targets.map(fn);
+}
+export function pipe(...ops: PipelineOp[]): PipelineOp {
+  return async (targets: TaskTarget[]) => {
+    for (const op of ops) {
+      targets = await op(targets);
+    }
+    return targets;
+  };
+}
+export function branch(...ops: PipelineOp[]): PipelineOp {
+  return async (targets: TaskTarget[]) => {
+    const targetsArrays = await Promise.all(ops.map(op => op(targets)));
+    return targetsArrays.flat();
+  };
+}
+export function branchGen(genFn: ()=>Generator<PipelineOp>): PipelineOp {
+  const opsToBranch = Array.from(genFn());
+  return (targets: TaskTarget[]) => {
+    return branch(...opsToBranch)(targets);
+  };
+}
+export async function execPaths(entries: ({path: string, op: PipelineOp })[]) {
+  return (await Promise.all(
+    // Map every entry path into a TaskTarget and run the PipelineOp with
+    // that TaskTarget
+    entries
+      .map(async ({path,op})=>{
+        const targets = [new TaskTarget(path)];
+        return await op(targets);
+      })
+  )).flat();
+}
 /**Verify, anything that fails is skipped and throws an error*/
 export async function verify(targets: TaskTarget[]) {
   const outTargets: TaskTarget[] = [];
   for (const t of targets) {
     // Make sure fsImpl is ready
+    // TODO: DO NOT PUT THIS IN VERIFY, this should go somewhere in the task building stuff...
     if ("ready" in t.fsImpl && !t.fsImpl.ready && t.fsImpl.init) {
       await t.fsImpl.init();
     }
@@ -302,109 +355,133 @@ export async function verify(targets: TaskTarget[]) {
     outTargets.push(t);
   }
   return outTargets;
 }
-/**Writes a manifest for parallel, a TSV where each record is an id + the shell to run
- * @todo Enforce doing a verify before we output?
- */
-export function getTSVManifest(targets: TaskTarget[]): string {
-  let out: string[] = [];
-  for (const t of targets) {
-    const shell = t.toShell();
-    out.push(`${t.id}\t${shell}`);
-  }
-  return out.join("\n");
-}
-export function getTaskManifest(targets: TaskTarget[]): [string, string][] {
-  let out: [string, string][] = [];
-  for (const t of targets) {
-    const shell = t.toShell();
-    out.push([t.id, shell] as const);
-  }
-  return out;
-}
-function collectionSwap(a: TaskTargetPipelineHelper, b: TaskTargetPipelineHelper) {
-  if (!a.__collection) {
-    return;
-  }
-  // Remove a, add b
-  const collection = a.__collection;
-  delete a.__collection;
-  collection.delete(a);
-  b.__collection = collection;
-  collection.add(b);
-}
-export class TaskTargetPipelineHelper extends Array<TaskTarget> {
-  __collection?: Set<TaskTargetPipelineHelper>;
-  static pipeline(t: TaskTarget[]): TaskTargetPipelineHelper {
-    if (Object.getPrototypeOf(t) === TaskTargetPipelineHelper.prototype) {
-      return t as any; // Already done
-    }
-    Object.setPrototypeOf(t, TaskTargetPipelineHelper.prototype);
-    return t as any;
-  }
-  _fn(fn: (t: TaskTarget[])=>TaskTarget[]): TaskTargetPipelineHelper {
-    const p = TaskTargetPipelineHelper.pipeline(this);
-    const t = fn(p);
-    const p2 = TaskTargetPipelineHelper.pipeline(t);
-    collectionSwap(p, p2); // Move collection pointer to the new item, ends always end up in the collection
-    return p2;
-  }
-  async _afn(fn: (t: TaskTarget[])=>Promise<TaskTarget[]>): Promise<TaskTargetPipelineHelper> {
-    const p = TaskTargetPipelineHelper.pipeline(this);
-    const t = await fn(p);
-    const p2 = TaskTargetPipelineHelper.pipeline(t);
-    collectionSwap(p, p2); // Move collection pointer to the new item, ends always end up in the collection
-    return p2;
-  }
-  cd(path: string): TaskTargetPipelineHelper {
-    return this._fn(t => cd(t, path));
-  }
-  glob(globPath: string): TaskTargetPipelineHelper {
-    return this._fn(t => glob(t, globPath));
-  }
-  async unzip(): Promise<TaskTargetPipelineHelper> {
-    return this._afn(unzip);
-  }
-  read(): TaskTargetPipelineHelper {
-    return this._fn(read);
-  }
-  cmd(_cmd: ValidCmd): TaskTargetPipelineHelper {
-    return this._fn(t => cmd(t, _cmd));
-  }
-  setId(id: ValidId): TaskTargetPipelineHelper {
-    return this._fn(t => setId(t, id));
-  }
-  types(...args: any[]) {
-    // TODO: no-op
-    return this;
-  }
-  csvSink(...args: any[]) {
-    // TODO: no-op
-    return this;
-  }
-  /**
-   * @todo Nested versions of this don't currently work, but they could if we
-   * turn __collection into an array of collections
-   */
-  collect(_c: Set<TaskTargetPipelineHelper>) {
-    this.__collection = _c;
-    return this;
-  }
-}
+export interface ProcessOutputAggregate {
+  stdout: string;
+  stderr: string;
+  exitCodes: (number | null)[];
+  duration: number;
+  ok: boolean;
+}
+export interface ProcessOutputSimple {
+  stdout: string;
+  stderr: string;
+  exitCode: number;
+  duration: number;
+  ok: boolean;
+}
+function combineProcessOutputAggregate(poa: ProcessOutputAggregate | undefined, t: TaskTarget, po: ProcessOutput) {
+  if (!poa) {
+    assert(t.aggregateColumns, "aggregate TaskTarget must have aggregateColumns");
+    const headers = t.aggregateColumns.join(",") + "\n";
+    return {
+      stdout: headers + po.stdout,
+      stderr: po.stderr,
+      exitCodes: [po.exitCode],
+      duration: po.duration,
+      ok: po.ok
+    };
+  }
+  // Comes with a builtin "\n" from jq on stdout and stderr, no need to add
+  // a trailing one
+  poa.stdout += po.stdout;
+  poa.stderr += po.stderr;
+  poa.exitCodes.push(po.exitCode);
+  poa.duration += po.duration;
+  poa.ok &&= po.ok;
+  return poa;
+}
+export interface RunOutput {
+  target: TaskTarget,
+  result: ProcessOutput | ProcessOutputAggregate | ProcessOutputSimple
+}
-export async function run(target: TaskTarget): Promise<ProcessPromise> {
+export async function run(target: TaskTarget): Promise<ProcessOutput> {
   const command = target.toShell();
   return await $({ nothrow: true })`bash -c ${command}`;
 }
+export async function runAll(targets: TaskTarget[]): Promise<RunOutput[]> {
+  const finalTargets = await verify(targets);
+  const results = await parallel(finalTargets, run, true);
+  const nonAggregateTargets: TaskTarget[] = finalTargets.filter(t => !t.aggregate);
+  const nonAggregateResults: RunOutput[] = [];
+  const aggregateResultsMap: Record<string, RunOutput> = {};
+  // == Aggregate tables ==
+  // Some TaskTargets have .aggregate: true, which means they should all be combined
+  // into a single task with the id of the .id property
+  for (const [idx, r] of results.entries()) {
+    const t = finalTargets[idx];
+    if (!t.aggregate) {
+      nonAggregateResults.push({
+        target: t,
+        result: r
+      });
+      continue;
+    }
+    const aggregateId = t.id;
+    const prevResult = aggregateResultsMap[aggregateId]?.result;
+    aggregateResultsMap[aggregateId] = {
+      target: t, // Use target t for metadata, so it will use the last target
+      result: combineProcessOutputAggregate(prevResult as (ProcessOutputAggregate | undefined), t, r)
+    };
+  }
+  // == Metadata table ==
+  // Each TaskTarget has things like perRowDescription and other things we want to store
+  // and output. This creates a single TaskTarget for all that per-table metadata
+  function csvEscape(s: string | undefined) {
+    if (s === undefined) {
+      return "";
+    }
+    if (s.includes("\"") || s.includes(",") || s.includes("\n")) {
+      return `"${s.replace(/\"/g, "\"\"")}"`;
+    }
+    return s;
+  }
+  let metadataCSV = "id,perRowDescription,perRowTags,columnMeta,metaId\n";
+  for (const t of nonAggregateTargets) {
+    const tableNamePart = t.id;
+    const perRowDescriptionPart = t.perRowDescription;
+    const perRowTagsPart = t.perRowTags;
+    const columnMetaPart = t.columnMeta?.join(",") ?? "";
+    const metaIdPart = t.metaId;
+    metadataCSV += [
+      csvEscape(tableNamePart),
+      csvEscape(perRowDescriptionPart),
+      csvEscape(perRowTagsPart),
+      csvEscape(columnMetaPart),
+      csvEscape(metaIdPart)
+    ].join(",") + "\n";
+  }
+  // Won't be removed by verify() because we're adding it after that's used
+  // TODO: Would be nice to bake this into TaskTarget/verify for tasks that dont point
+  // to a real path
+  const metadataTarget = new TaskTarget("<none>");
+  metadataTarget
+    // id, perRowDescription, perRowTags, columnMeta, metaId
+    .assignMeta({
+      idValue: "base_data_manager_metadata",
+      columnMeta: ["any", "any", "any", "any", "any"],
+      perRowTags: "internal",
+    });
+  const metadataResult = {
+    stdout: metadataCSV,
+    stderr: "",
+    exitCode: 0,
+    duration: 0, // TODO
+    ok: true
+  };
+  const metadataRunOutput: RunOutput = { target: metadataTarget, result: metadataResult };
+  const aggregateResults: RunOutput[] = Object.values(aggregateResultsMap);
+  return aggregateResults.concat(nonAggregateResults).concat(metadataRunOutput);
+}
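The metadata CSV in `runAll` is built by hand with a local `csvEscape` helper. The quoting rule it implements — wrap the field in double quotes only when it contains a quote, comma, or newline, and double any embedded quotes — is the standard RFC 4180 one. A standalone sketch of that rule (note that a bare `\r` on its own would not trigger quoting here, an acceptable simplification for this internal table):

```typescript
// RFC 4180-style CSV field escaping: quote only when necessary,
// doubling any embedded double quotes
function csvEscape(s: string | undefined): string {
  if (s === undefined) return "";
  if (s.includes('"') || s.includes(",") || s.includes("\n")) {
    return `"${s.replace(/"/g, '""')}"`;
  }
  return s;
}

console.log(csvEscape("plain"));    // plain
console.log(csvEscape("a,b"));      // "a,b"
console.log(csvEscape('say "hi"')); // "say ""hi"""
console.log(csvEscape(undefined));  // (empty string)
```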
@@ -1,8 +1,7 @@
 import { strict as assert } from "node:assert";
-import fs from "node:fs";
-import path from "node:path";
 import { Readable } from "node:stream";
 import yauzl from "yauzl";
+import { globSync } from "glob";

 function removeDummyCwd(path: string) {
   if (path.startsWith("/DUMMYCWD/")) {
@@ -309,6 +308,16 @@ export class ZipFS {
     }
   }
+  globSync(globPath: string) {
+    const selfImpl = this.getImpl();
+    return globSync(globPath, {
+      fs: selfImpl as any,
+      // We strip this later, this is so glob() doesn't use the cwd of the current
+      // process, which matches no files inside the .zip file
+      cwd: `/DUMMYCWD`
+    });
+  }
   getImpl() {
     // Because glob uses ...xxx notation to unpack ourselves into a _new_ object
     // we need to make sure that we DONT use a class, otherwise the properties
@@ -319,6 +328,7 @@ export class ZipFS {
       init: this.init.bind(this),
       ready: this.ready,
+      globSync: this.globSync.bind(this),
       statSync: this.statSync.bind(this),
       createReadStream: this.createReadStream.bind(this),
       createWriteStream: this.createWriteStream.bind(this),

main.ts

@@ -1,67 +1,90 @@
-import fs from 'node:fs/promises';
-import nodePath from "node:path";
-import { DatabaseSync } from "node:sqlite";
-import "./data-export/facebook.ts";
+import { type DatabaseSync } from "node:sqlite";
+import { fileURLToPath } from "node:url";
 import { google } from "./data-export/google.ts";
-import { TaskTargetPipelineHelper } from "./data-export/task.ts";
+import { facebook, facebook_v2 } from "./data-export/facebook.ts";
+import { type TaskTarget, execPaths } from "./data-export/task.ts";
+import * as DataIO from "./data-export/io.ts";

-declare module "./data-export/task.ts" {
-  interface TaskTargetPipelineHelper {
-    google: typeof google;
-  }
-}
-
-Object.assign(TaskTargetPipelineHelper.prototype, {
-  google
-});
+const __filename = fileURLToPath(import.meta.url);
+
+export const startTime = Date.now();
+export const elapsed = ()=>`${((Date.now() - startTime) / 1000).toFixed(2)}s`;

-function loadIntoSqlite(
-  paths: string[],
-  sqlitePath: string
-) {
-  // Open an in-memory db for speed
-  const db = new DatabaseSync(":memory:", { allowExtension: true });
-  db.loadExtension("/home/cobertos/sqlite-files/csv.so")
-  db.enableLoadExtension(false);
-  for (const path of paths) {
-    const table = nodePath.basename(path, ".csv");
-    console.log(`Loading ${path} → table ${table}`);
-    // const headers = lines[0].split(",");
-    // const columnsSql = headers.map(h => `"${h}" TEXT`).join(", ");
-    db.exec(`CREATE VIRTUAL TABLE temp.intermediate USING csv(filename='${path}');`);
-    db.exec(`CREATE TABLE "${table}" AS SELECT * FROM intermediate;`);
-    db.exec(`DROP TABLE IF EXISTS intermediate;`);
-  }
-  // Dump it all to the path specified
-  db.exec(`VACUUM main INTO '${sqlitePath}'`);
-  db.close();
-}
+export async function loadTaskInNewDb(targets: TaskTarget[]): Promise<DatabaseSync> {
+  console.log(`${elapsed()} - Run all targets`);
+  const out = await DataIO.runPipeline(targets);
+  console.log(`${elapsed()} - Final targets exported to CSV. Got ${out.length} targets`);
+
+  // TODO: Add an option to output everything plainly as CSV in a single directory
+
+  console.log(`${elapsed()} - Building combined database table in :memory:`);
+  const db = DataIO.getDefaultDB();
+  await DataIO.loadIntoDb(db, out);
+  const tableCount = db.prepare(`SELECT COUNT(*) as count FROM base_data_manager_metadata`).get()!.count;
+  console.log(`${elapsed()} - Single database built with ${tableCount} tables`);
+  return db;
+}

 async function main() {
-  const t = TaskTargetPipelineHelper;
-  // TODO:
-  // t.fork().cd("/home/cobertos/Seafile/archive/ExportedServiceData/facebook/formapcast_facebook-DEADNAME-May2021-json")
-  //   .facebook()
-  // (await t.fork().cd("/home/cobertos/Seafile/archive/ExportedServiceData/facebook/facebook-x-2025-11-29-x.zip").zip()).facebook_v2();
-  // t.fork().cd("/home/cobertos/Seafile/archive/ExportedServiceData/google/2023-NAMEwork-001")
-  //   .google()
-  // let zipTask = t.fork().zip("/home/cobertos/Seafile/archive/ExportedServiceData/facebook/facebook-DEADNAME-May2021-json.zip");
-  // await (zipTask.fsImpl as any).init();
-  // zipTask.facebook();
-  // Now take the output and load it all into a single SQLITE file
-  // const entries = await fs.readdir('OUTTEST', { withFileTypes: true });
-  // const csvFiles = entries
-  //   .filter(e => e.isFile() && e.name.endsWith(".csv"))
-  //   .map(e => nodePath.join('OUTTEST', e.name));
-  // await fs.unlink('your.db');
-  // loadIntoSqlite(csvFiles, 'your.db');
-}
-main();
+  // Configurable stuff
+  const sqlitePath = 'your.db';
+
+  console.log(`${elapsed()} - Building targets`);
+  const targets = await execPaths([
+    {path: "/home/cobertos/Seafile/archive/ExportedServiceData/facebook/formapcast_facebook-DEADNAME-May2021-json", op: facebook()}
+    // {path: "/home/cobertos/Seafile/projects/base-data-manager/test/fixtures/facebook-json-2021-05-01", op: facebook()}
+    // {path: "/home/cobertos/Seafile/archive/ExportedServiceData/facebook/facebook-x-2025-11-29-x.zip", op: pipe(unzip(), facebook_v2())}
+    // {path: "/home/cobertos/Seafile/archive/ExportedServiceData/google/2023-NAMEwork-001", op: facebook_v2()}
+  ]);
+  console.log(`${elapsed()} - Found ${targets.filter(t => !t.aggregate).length} possible targets`);
+
+  const db = await loadTaskInNewDb(targets);
+
+  console.log(`${elapsed()} - Writing database to disk at "${sqlitePath}"`);
+  DataIO.dumpDBToDisk(db, sqlitePath);
+  console.log(`${elapsed()} - Database written to disk`);
+}
+if (process.argv[1] === __filename) {
+  main();
+}
+
+// TODO: Move this into here
+// csvSink(
+//   summarization?: [string, string][]
+// ) {
+//   // TODO:
+//   return this;
+//   // Ingest this csv into the database at the given id
+//   // this.cmd(t=>["sqlite-utils", "insert", "your.db", t.id, "-", "--csv", "--detect-types"]);
+//   // Add a post processing function for these targets that prints out the summarization
+//   // stats
+//   // this.post(async (t: TaskTarget)=>{
+//   //   // We only do the first one so far for the summarization
+//   //   let queryLine: string;
+//   //   let formatFn: (r: any)=>string;
+//   //   const [columnName, type] = summarization?.[0] ?? [undefined, undefined];
+//   //   if (type === "numeric") {
+//   //     queryLine = `min(${columnName}) as lo, max(${columnName}) as hi, count(*) as n`;
+//   //     formatFn = (r: any)=>`${r.n} rows from ${r.lo} to ${r.hi} for ${t.id}`;
+//   //   }
+//   //   else {
+//   //     queryLine = `count(*) as n`;
+//   //     formatFn = (r: any)=>`${r.n} rows for ${t.id}`;
+//   //   }
+//   //   const cmd = "sqlite-utils";
+//   //   const args = ["query", "your.db", `select ${queryLine} from ${t.id}`]
+//   //   const { stdout, stderr } = await execFile(cmd, args);
+//   //   const results = JSON.parse(stdout);
+//   //   const result = results[0]; // should only be one result in the array for this type of query
+//   //   const logLine = formatFn(result);
+//   //   (t as any).log = logLine;
+//   // });
+//   // return this;
+// }


@@ -6,8 +6,9 @@
   "type": "module",
   "scripts": {
     "test": "node --enable-source-maps --test --experimental-transform-types --no-warnings ./test/task.ts",
-    "test2": "node --enable-source-maps --test --experimental-transform-types --no-warnings ./test/facebook.ts",
-    "test-update-snapshots": "node --enable-source-maps --test --experimental-transform-types --no-warnings --test-update-snapshots ./test/facebook.ts",
+    "test-scrub": "node --enable-source-maps --test --experimental-transform-types --no-warnings ./test/scrub.ts",
+    "test-exports": "node --enable-source-maps --test --experimental-transform-types --no-warnings ./test/data-export.ts",
+    "test-exports-snapshots": "node --enable-source-maps --test --experimental-transform-types --no-warnings --test-update-snapshots ./test/data-export.ts",
     "dev": "vite --port 2223",
     "server": "node --experimental-transform-types server/server.ts",
     "prototype": "node --import ./util/tsx-loader.js --import ./util/ignore-css-loader.js --experimental-transform-types server/prototype.ts"
@@ -20,7 +21,6 @@
     "@types/duplexify": "^3.6.5",
     "@types/yauzl": "^2.10.3",
     "duplexify": "^4.1.3",
-    "fp-ts": "^2.16.11",
     "glob": "^13.0.0",
     "htmlparser2": "^10.0.0",
     "yauzl": "^3.2.0",
@@ -28,6 +28,9 @@
   },
   "devDependencies": {
     "@types/node": "^24.1.0",
+    "csv-parse": "^6.1.0",
+    "csv-stringify": "^6.6.0",
+    "diff": "^8.0.3",
     "typescript": "^5.9.3"
   }
 }

pnpm-lock.yaml (generated)

@@ -17,9 +17,6 @@ importers:
       duplexify:
         specifier: ^4.1.3
         version: 4.1.3
-      fp-ts:
-        specifier: ^2.16.11
-        version: 2.16.11
       glob:
         specifier: ^13.0.0
         version: 13.0.0
@@ -36,6 +33,15 @@ importers:
       '@types/node':
         specifier: ^24.1.0
         version: 24.10.0
+      csv-parse:
+        specifier: ^6.1.0
+        version: 6.1.0
+      csv-stringify:
+        specifier: ^6.6.0
+        version: 6.6.0
+      diff:
+        specifier: ^8.0.3
+        version: 8.0.3
       typescript:
         specifier: ^5.9.3
         version: 5.9.3
@@ -62,6 +68,16 @@ packages:
   buffer-crc32@0.2.13:
     resolution: {integrity: sha512-VO9Ht/+p3SN7SKWqcrgEzjGbRSJYTx+Q1pTQC0wrWqHx0vpJraQ6GtHx8tvcg1rlK1byhU5gccxgOgj7B0TDkQ==}
+  csv-parse@6.1.0:
+    resolution: {integrity: sha512-CEE+jwpgLn+MmtCpVcPtiCZpVtB6Z2OKPTr34pycYYoL7sxdOkXDdQ4lRiw6ioC0q6BLqhc6cKweCVvral8yhw==}
+  csv-stringify@6.6.0:
+    resolution: {integrity: sha512-YW32lKOmIBgbxtu3g5SaiqWNwa/9ISQt2EcgOq0+RAIFufFp9is6tqNnKahqE5kuKvrnYAzs28r+s6pXJR8Vcw==}
+  diff@8.0.3:
+    resolution: {integrity: sha512-qejHi7bcSD4hQAZE0tNAawRK1ZtafHDmMTMkrrIGgSLl7hTnQHmKCeB45xAcbfTqK2zowkM3j3bHt/4b/ARbYQ==}
+    engines: {node: '>=0.3.1'}
   dom-serializer@2.0.0:
     resolution: {integrity: sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==}
@@ -89,9 +105,6 @@ packages:
     resolution: {integrity: sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g==}
     engines: {node: '>=0.12'}
-  fp-ts@2.16.11:
-    resolution: {integrity: sha512-LaI+KaX2NFkfn1ZGHoKCmcfv7yrZsC3b8NtWsTVQeHkq4F27vI5igUuO53sxqDEa2gNQMHFPmpojDw/1zmUK7w==}
   glob@13.0.0:
     resolution: {integrity: sha512-tvZgpqk6fz4BaNZ66ZsRaZnbHvP/jG3uKJvAZOwEVUL4RTA5nJeeLYfyN9/VA8NX/V3IBG+hkeuGpKjvELkVhA==}
     engines: {node: 20 || >=22}
@@ -182,6 +195,12 @@ snapshots:
   buffer-crc32@0.2.13: {}
+  csv-parse@6.1.0: {}
+
+  csv-stringify@6.6.0: {}
+
+  diff@8.0.3: {}
   dom-serializer@2.0.0:
     dependencies:
       domelementtype: 2.3.0
@@ -215,8 +234,6 @@ snapshots:
   entities@6.0.1: {}
-  fp-ts@2.16.11: {}
   glob@13.0.0:
     dependencies:
       minimatch: 10.1.1

summary.ts (new file)

@@ -0,0 +1,216 @@
import { type DatabaseSync, type SQLOutputValue } from "node:sqlite";
import { fileURLToPath } from "node:url";
import { stringify } from "csv-stringify/sync";
import { facebook } from "./data-export/facebook.ts";
import { execPaths, COLUMN_TYPES } from "./data-export/task.ts";
import { elapsed, loadTaskInNewDb } from "./main.ts";
const __filename = fileURLToPath(import.meta.url);
type ColumnMetaType = keyof typeof COLUMN_TYPES;
interface MetadataRow {
id: string;
perRowTags?: string;
columnMeta: ColumnMetaType[];
columnNames: string[];
}
// ── query helpers ─────────────────────────────────────────────────────────────
function q(name: string) {
return `"${name}"`;
}
/** Format a number compactly: integer if whole, otherwise 4 sig figs */
function fmt(v: number): string {
  if (!isFinite(v)) return String(v);
  if (Number.isInteger(v)) return String(v);
  // Only strip trailing zeros that follow a decimal point; a bare /\.?0+$/
  // would also mangle values like "1200" (from toPrecision) into "12"
  return v.toPrecision(4).replace(/(\.\d*?)0+$/, '$1').replace(/\.$/, '');
}
/** Non-null filter: treats SQL NULL, empty string, and the literal "null" as missing */
function notNull(col: string) {
return `${q(col)} IS NOT NULL AND ${q(col)} != '' AND ${q(col)} != 'null'`;
}
function rowCount(db: DatabaseSync, table: string): number {
return (db.prepare(`SELECT count(*) as n FROM ${q(table)}`).get() as { n: number }).n;
}
function datetimeRange(db: DatabaseSync, table: string, col: string): string {
const r = db.prepare(
`SELECT MIN(${q(col)}) as lo, MAX(${q(col)}) as hi FROM ${q(table)} WHERE ${notNull(col)}`
).get() as { lo: string | null; hi: string | null };
if (!r.lo) return '(no dates)';
// Trim to date portion if it looks like a full ISO datetime — keeps the line shorter
const trim = (s: string) => s.length > 10 && s[10] === 'T' ? s.slice(0, 10) : s;
return `${trim(r.lo)}..${trim(r.hi!)}`;
}
function numericRange(db: DatabaseSync, table: string, col: string): { lo: number; hi: number } | null {
const r = db.prepare(
`SELECT MIN(CAST(${q(col)} AS REAL)) as lo, MAX(CAST(${q(col)} AS REAL)) as hi
FROM ${q(table)} WHERE ${notNull(col)}`
).get() as { lo: number | null; hi: number | null };
return r.lo !== null ? { lo: r.lo, hi: r.hi! } : null;
}
function topValues(db: DatabaseSync, table: string, col: string, n: number): { distinct: number; top: { v: string; c: number }[] } {
const distinct = (db.prepare(
`SELECT count(distinct ${q(col)}) as d FROM ${q(table)} WHERE ${notNull(col)}`
).get() as { d: number }).d;
const top = db.prepare(
`SELECT ${q(col)} as v, count(*) as c FROM ${q(table)} WHERE ${notNull(col)}
GROUP BY ${q(col)} ORDER BY c DESC LIMIT ${n}`
).all() as { v: string; c: number }[];
return { distinct, top };
}
// ── metadata parsing (mirrors timelinize.ts) ──────────────────────────────────
function getColumnNames(db: DatabaseSync, tableName: string): string[] {
return db.prepare(`PRAGMA table_info(${q(tableName)})`).all().map(c => (c as any).name) as string[];
}
function parseMetadataRow(db: DatabaseSync, row: Record<string, SQLOutputValue>): MetadataRow | undefined {
const { id, perRowTags, columnMeta: columnMetaCSV } = row;
if (!id || typeof id !== 'string') return undefined;
const columnNames = getColumnNames(db, id);
// columnMeta may be absent for tables without type annotations — still useful to show
let columnMeta: ColumnMetaType[] = [];
if (columnMetaCSV && typeof columnMetaCSV === 'string') {
const parsed = columnMetaCSV.split(',') as ColumnMetaType[];
if (parsed.length === columnNames.length) {
columnMeta = parsed;
}
}
return {
id,
perRowTags: typeof perRowTags === 'string' ? perRowTags : undefined,
columnMeta,
columnNames,
};
}
/** Maps semantic type names → the actual column name in this table (first match wins) */
function metaToNames(meta: MetadataRow): Partial<Record<ColumnMetaType, string>> {
const out: Partial<Record<ColumnMetaType, string>> = {};
for (const [idx, colName] of meta.columnNames.entries()) {
const type = meta.columnMeta[idx];
if (!type || out[type]) continue; // skip untyped or already-seen types
out[type] = colName;
}
return out;
}
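`metaToNames` pairs the positional `columnMeta` entries with `columnNames`, keeping the first column seen for each semantic type. A self-contained re-implementation of that first-match-wins behavior, with the types simplified to plain strings (hypothetical simplification, for illustration only):

```typescript
// First-match-wins mapping from semantic type -> column name,
// mirroring metaToNames with plain strings instead of ColumnMetaType
function metaToNames(columnNames: string[], columnMeta: string[]): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [idx, colName] of columnNames.entries()) {
    const type = columnMeta[idx];
    if (!type || out[type]) continue; // skip untyped or already-seen types
    out[type] = colName;
  }
  return out;
}

const names = ["ts", "msg", "ts_edited", "who"];
const meta = ["isodatetime", "any", "isodatetime", "sender"];
console.log(metaToNames(names, meta));
// -> isodatetime maps to "ts" (not "ts_edited"), any to "msg", sender to "who"
```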
// ── table row builder ─────────────────────────────────────────────────────────
const TOP_SENDERS = 3;
const MAX_SENDER_LEN = 20;
interface SummaryRow {
type: string;
id: string;
n: string;
dates: string;
senders: string;
geo: string;
tags: string;
}
function buildSummaryRow(db: DatabaseSync, meta: MetadataRow): SummaryRow {
const { id, perRowTags } = meta;
const typeMap = metaToNames(meta);
const n = rowCount(db, id);
// ── shape label ─────────────────────────────────────────────────────────────
let type: string;
if (typeMap.sender && typeMap.isodatetime) type = 'chat';
else if (typeMap.isodatetime) type = 'time';
else type = 'static';
if (typeMap.lat && typeMap.lng) type += '+geo';
// ── datetime range ──────────────────────────────────────────────────────────
const dates = typeMap.isodatetime ? datetimeRange(db, id, typeMap.isodatetime) : '';
// ── sender info ─────────────────────────────────────────────────────────────
let senders = '';
if (typeMap.sender) {
const { distinct, top } = topValues(db, id, typeMap.sender, TOP_SENDERS);
const topStr = top
.map(r => {
const name = r.v.length > MAX_SENDER_LEN ? r.v.slice(0, MAX_SENDER_LEN - 1) + '…' : r.v;
return `${name}×${r.c}`;
})
.join(', ');
senders = `${distinct} [${topStr}]`;
}
// ── geo ranges ──────────────────────────────────────────────────────────────
let geo = '';
const latR = typeMap.lat ? numericRange(db, id, typeMap.lat) : null;
const lngR = typeMap.lng ? numericRange(db, id, typeMap.lng) : null;
if (latR && lngR) {
geo = `[${fmt(latR.lo)}..${fmt(latR.hi)}] [${fmt(lngR.lo)}..${fmt(lngR.hi)}]`;
} else if (latR) {
geo = `lat=[${fmt(latR.lo)}..${fmt(latR.hi)}]`;
}
return { type, id, n: String(n), dates, senders, geo, tags: perRowTags ?? '' };
}
// ── CSV output ────────────────────────────────────────────────────────────────
const COLUMNS: { key: keyof SummaryRow; header: string }[] = [
{ key: 'type', header: 'type' },
{ key: 'id', header: 'id' },
{ key: 'n', header: 'n' },
{ key: 'dates', header: 'dates' },
{ key: 'senders', header: 'senders' },
{ key: 'geo', header: 'lat / lng' },
{ key: 'tags', header: 'tags' },
];
function printCSV(summaryRows: SummaryRow[]) {
const records = [
COLUMNS.map(c => c.header),
...summaryRows.map(row => COLUMNS.map(c => row[c.key])),
];
process.stdout.write(stringify(records));
}
// ── main ─────────────────────────────────────────────────────────────────────
async function main() {
process.stderr.write(`${elapsed()} - Building targets\n`);
const targets = await execPaths([
{path: "/home/cobertos/Seafile/archive/ExportedServiceData/facebook/formapcast_facebook-DEADNAME-May2021-json", op: facebook()}
// {path: "/home/cobertos/Seafile/archive/ExportedServiceData/fitbit/FullHumanName", op: fitbit()}
]);
process.stderr.write(`${elapsed()} - Found ${targets.filter(t => !t.aggregate).length} possible targets\n`);
const db = await loadTaskInNewDb(targets);
const rows = db.prepare(
`SELECT id, perRowTags, columnMeta FROM base_data_manager_metadata ORDER BY id`
).all() as Record<string, SQLOutputValue>[];
const summaryRows: SummaryRow[] = [];
for (const row of rows) {
const meta = parseMetadataRow(db, row);
if (!meta) continue;
summaryRows.push(buildSummaryRow(db, meta));
}
printCSV(summaryRows);
db.close();
}
if (process.argv[1] === __filename) {
main();
}

test/data-export.ts (new file)

@@ -0,0 +1,93 @@
import test from "node:test";
import nodePath from "node:path";
import fs from "node:fs/promises";
import { strict as assert } from "node:assert";
import { unzip, pipe, execPaths, type PipelineOp } from "../data-export/task.ts";
import * as DataIO from "../data-export/io.ts";
import { assertCSVWellFormed } from "./utils/csvUtils.ts";
import { assertStringEq, ptry } from "./utils/general.ts";
import { facebook, facebook_v2 } from "../data-export/facebook.ts";
import { discord } from "../data-export/discord.ts";
import { snapchat } from "../data-export/snapchat.ts";
import { discord_chat_exporter } from "../data-export/discord-chat-exporter.ts";
import { fitbit } from "../data-export/fitbit.ts";
const THIS_FILE = import.meta.dirname;
const SNAPSHOT_DIR = nodePath.join(THIS_FILE, 'snapshots');
const updateSnapshots = process.execArgv.includes("--test-update-snapshots");
/**Custom version of t.snapshot
 * * We save each csv id to its own file (regardless of where it came from)
 * * Properly handles \r\n, which nodejs's t.snapshot gets rid of because of
 *   how it encodes it into backticks/template literals
 */
async function snapshotCSV(id: string, csv: string) {
const snapshotFilePath = nodePath.join(SNAPSHOT_DIR, `${id}.snapshot.csv`);
if (updateSnapshots) {
// Update the snapshots, do no checking, our internal csv is the source of truth
await fs.writeFile(snapshotFilePath, csv, { encoding: "utf8" });
}
else {
const [err, prevCSV] = await ptry(fs.readFile(snapshotFilePath, { encoding: "utf8" }));
assert(!err, `Snapshot file '${snapshotFilePath}' did not exist. Perhaps you need to update snapshots, "--test-update-snapshots"?`);
assertStringEq(csv, prevCSV, "csv and snapshot csv should be the same");
}
}
async function testPipelineOp(path: string, op: PipelineOp, overwriteIdPrefix?: string) {
const targets = await execPaths([{ path, op }]);
const out = await DataIO.runPipeline(targets);
const idAndCSVs: [string, string][] = [];
// Verify and collect all the id + csv tuples
for (const {target, result} of out) {
const id = target.id;
// Check the result for success
assert.ok(!result.stderr, `Task ${id} should have no stderr output`);
assert.ok(result.ok, `Task ${id} should be okay`);
// Check the CSV itself for correctness
const csv = result.stdout;
assertCSVWellFormed(csv, `${csv}\nTask ${id} should have well-formed csv.`);
idAndCSVs.push([target.id, csv]);
}
// Everything is verified for cleanliness coming out of the current run, verify
// against the snapshots + save if we're updating snapshots
const idPrefix = overwriteIdPrefix ?? path.split("/").pop(); // Make unique with the last name of the path
await Promise.all(idAndCSVs.map(([id, csv])=>snapshotCSV(`${idPrefix}_${id}`, csv)));
}
test("facebook: Can load the 2021-05 export", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/facebook-json-2021-05-01');
await testPipelineOp(path, facebook());
});
test("facebook: Can load the 2021-05 export zipped", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/facebook-json-2021-05-01.zip');
await testPipelineOp(path, pipe(unzip(), facebook()));
});
test("facebook: Can load the 2025-11 export", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/facebook-json-2025-11-29');
await testPipelineOp(path, facebook_v2());
});
test("discord: Can load the 2021-01 export", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/discord-json-2021-01');
await testPipelineOp(path, discord());
});
test("snapchat: Can load the 2023-11 export", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/snapchat-2023-11');
await testPipelineOp(path, snapchat());
});
test("discord-chat-exporter: Can load the 2026-02 export", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/discord-chat-exporter-2026-02');
await testPipelineOp(path, discord_chat_exporter());
});
test("fitbit: Can load the 2026-02 export", async () => {
const path = nodePath.join(THIS_FILE, 'fixtures/fitbit-2026-02/FullHumanName');
await testPipelineOp(path, fitbit(), "fitbit-2026-02");
});


@ -1,73 +0,0 @@
import test from "node:test";
import nodePath from "node:path";
import { strict as assert } from "node:assert";
import { finished } from "node:stream/promises";
import { Readable, Writable } from "node:stream";
import { TaskTargetPipelineHelper, TaskTarget, verify, getTSVManifest, getTaskManifest, run } from "../data-export/task.ts";
import { parallel } from "../data-export/parallel.ts";
import "../data-export/facebook.ts";
const THIS_FILE = import.meta.dirname;
const FACEBOOK_V1_DIR = nodePath.join(THIS_FILE, 'fixtures/facebook-json-2021-05-01');
const FACEBOOK_V1_ZIPPED = nodePath.join(THIS_FILE, 'fixtures/facebook-json-2021-05-01.zip');
const FACEBOOK_V2_DIR = nodePath.join(THIS_FILE, 'fixtures/facebook-json-2025-11-29');
test("facebook: Can load the 2021 export", async (t) => {
const targets = TaskTargetPipelineHelper.pipeline([
new TaskTarget(FACEBOOK_V1_DIR)
])
.facebook();
const finalTargets = await verify(targets);
const result = await parallel(finalTargets, true);
for (const [id, r] of result.entries()) {
assert.ok(!r.stderr, `Task ${id} should have no stderr output`);
assert.ok(r.ok, `Task ${id} should be okay`);
}
const allCSV = Array.from(result.entries())
.sort() // Keep stable ordering for snapshots
.map(([id, r]) => r.stdout);
t.assert.snapshot(allCSV);
});
test("facebook: Can load the 2021 export zipped", async (t) => {
const targets = await TaskTargetPipelineHelper.pipeline([
new TaskTarget(FACEBOOK_V1_ZIPPED)
])
.unzip();
const targets2 = targets
.facebook();
const finalTargets = await verify(targets2);
const result = await parallel(finalTargets, true);
for (const [id, r] of result.entries()) {
assert.ok(!r.stderr, `Task ${id} should have no stderr output`);
assert.ok(r.ok, `Task ${id} should be okay`);
}
const allCSV = Array.from(result.entries())
.sort() // Keep stable ordering for snapshots
.map(([id, r]) => r.stdout);
t.assert.snapshot(allCSV);
});
test("facebook: Can load the 2025 export", async (t) => {
const targets = TaskTargetPipelineHelper.pipeline([
new TaskTarget(FACEBOOK_V2_DIR)
])
.facebook_v2();
const finalTargets = await verify(targets);
const result = await parallel(finalTargets, true);
for (const [id, r] of result.entries()) {
assert.ok(!r.stderr, `Task ${id} should have no stderr output`);
assert.ok(r.ok, `Task ${id} should be okay`);
}
const allCSV = Array.from(result.entries())
.sort() // Keep stable ordering for snapshots
.map(([id, r]) => r.stdout);
t.assert.snapshot(allCSV);
});


@ -1,116 +0,0 @@
exports[`facebook: Can load the 2021 export 1`] = `
[
"\\"album\\",\\"uri\\",\\"creation_timestamp\\"\\n\\"xxx\\",\\"photos_and_videos/CoverPhotos_yyyyyy/200x200png.png\\",\\"2024-03-07T15:23:20Z\\"\\n\\"xxx\\",\\"photos_and_videos/CoverPhotos_yyyyyy/200x200png.png\\",\\"2024-07-01T07:46:40Z\\"\\n",
"[\\n \\"from\\",\\n \\"to\\",\\n \\"timestamp\\",\\n \\"body\\"\\n]\\n\\"Me\\",\\"xxx\\",\\"2024-01-13T07:13:20Z\\",\\"xxx\\"\\n\\"Me\\",\\"xxx\\",\\"2024-01-13T07:13:20Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"action\\",\\"ip\\",\\"user_agent\\",\\"datr_cookie\\",\\"city\\",\\"region\\",\\"country\\",\\"site_name\\",\\"timestamp\\"\\n\\"xxx\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n",
"\\"status\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"2024-02-13T14:36:40Z\\"\\n",
"\\"service_name\\",\\"native_app_id\\",\\"username\\",\\"email\\",\\"phone_number\\",\\"name\\"\\n\\"xxx\\",69,\\"xxx\\",\\"not_a_real_email@example.com\\",\\"xxx\\",\\"xxx\\"\\n\\"xxx\\",1707005000,\\"xxx\\",\\"not_a_real_email@example.com\\",,\\"xxx\\"\\n",
"\\"event\\",\\"created_timestamp\\",\\"ip_address\\",\\"user_agent\\",\\"datr_cookie\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\",,,\\n\\"xxx\\",\\"2024-02-13T14:36:40Z\\",,,\\n",
"\\"name\\",\\"added_timestamp\\"\\n\\"xxx\\",\\"2024-12-29T08:13:20Z\\"\\n\\"xxx\\",\\"2024-09-02T12:26:40Z\\"\\n",
"\\"name\\",\\"created_timestamp\\",\\"updated_timestamp\\",\\"ip_address\\",\\"user_agent\\",\\"location\\",\\"app\\",\\"session_type\\",\\"datr_cookie\\"\\n\\"xxx\\",\\"2024-08-22T01:26:40Z\\",\\"2024-05-11T15:06:40Z\\",\\"1.1.1.1\\",\\"some/path\\",\\"\\",\\"\\",\\"\\",\\"xxx\\"\\n",
"\\"timestamp\\",\\"data\\",\\"title\\"\\n\\"2024-02-08T19:20:00Z\\",\\"TODO\\",\\"xxx\\"\\n\\"2024-01-17T14:00:00Z\\",\\"TODO\\",\\"xxx\\"\\n",
"\\"timestamp\\",\\"email\\",\\"contact_type\\"\\n\\"2024-10-18T07:03:20Z\\",\\"not_a_real_email@example.com\\",69\\n\\"2024-01-21T22:10:00Z\\",\\"not_a_real_email@example.com\\",69\\n",
"\\"name\\"\\n\\"xxx\\"\\n\\"xxx\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-02-13T13:13:20Z\\"\\n\\"xxx\\",\\"2024-10-31T00:36:40Z\\"\\n",
"\\"game\\",\\"added_timestamp\\"\\n\\"xxx\\",\\"2024-11-03T16:06:40Z\\"\\n",
"\\"title\\",\\"price\\",\\"seller\\",\\"created_timestamp\\",\\"latitude\\",\\"longitude\\",\\"description\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-12-18T05:33:20Z\\",69,69,\\"xxx\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-12-18T05:33:20Z\\",69,69,\\"xxx\\"\\n",
"\\"action\\",\\"timestamp\\",\\"site\\",\\"ip_address\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\",\\"xxx\\",\\"1.1.1.1\\"\\n\\"xxx\\",\\"2024-04-23T17:56:40Z\\",\\"xxx\\",\\"1.1.1.1\\"\\n",
"\\"timestamp\\",\\"unread\\",\\"href\\",\\"text\\"\\n\\"2024-04-30T08:16:40Z\\",true,\\"url://somewhere\\",\\"xxx\\"\\n\\"2024-04-30T08:16:40Z\\",true,\\"url://somewhere\\",\\"xxx\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n",
"\\"from\\",\\"to\\",\\"amount\\",\\"currency\\",\\"type\\",\\"status\\",\\"payment_method\\",\\"created_timestamp\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-05-05T21:36:40Z\\"\\n",
"\\"name\\",\\"uri\\",\\"timestamp\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-01-15T12:00:00Z\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-01-12T06:13:20Z\\"\\n",
"\\"from\\",\\"to\\",\\"rank\\",\\"timestamp\\"\\n\\"xxx\\",\\"xxx\\",69,\\"2024-07-22T19:03:20Z\\"\\n",
"\\"title\\",\\"timestamp\\",\\"reaction\\"\\n,\\"2024-01-14T06:50:00Z\\",\\"xxx\\"\\n,\\"2024-01-14T06:50:00Z\\",\\"xxx\\"\\n",
"\\"title\\",\\"timestamp\\"\\n,\\"2024-10-06T08:56:40Z\\"\\n,\\"2024-10-06T08:56:40Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-02-08T16:33:20Z\\"\\n\\"xxx\\",\\"2024-09-24T19:10:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-09-27T15:13:20Z\\"\\n\\"xxx\\",\\"2024-08-24T00:40:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-06-23T05:20:00Z\\"\\n\\"xxx\\",\\"2024-05-25T08:16:40Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n\\"xxx\\",\\"2024-04-28T20:10:00Z\\"\\n",
"\\"from\\",\\"to\\",\\"subject\\",\\"message\\",\\"timestamp\\"\\n\\"not_a_real_email@example.com\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-10-16T06:26:40Z\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"url://somewhere\\",\\"2024-10-16T06:26:40Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-12-17T08:43:20Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n",
"\\"name\\",\\"id\\",\\"type\\",\\"timestamp\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-11T12:36:40Z\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-10T19:56:40Z\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-10T11:36:40Z\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-07T21:06:40Z\\"\\n",
"\\"name\\",\\"uri\\",\\"timestamp\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-02-27T05:00:00Z\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-05-16T03:26:40Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"TODO: data\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"TODO: data\\",\\"2024-10-31T06:10:00Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"TODO\\",\\"2024-02-08T19:20:00Z\\"\\n\\"xxx\\",\\"TODO\\",\\"2024-02-08T19:20:00Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"xxx\\",\\"2024-11-17T06:30:00Z\\"\\n\\"xxx\\",\\"xxx\\",\\"2024-11-17T06:30:00Z\\"\\n"
]
`;
exports[`facebook: Can load the 2021 export zipped 1`] = `
[
"\\"album\\",\\"uri\\",\\"creation_timestamp\\"\\n\\"xxx\\",\\"photos_and_videos/CoverPhotos_yyyyyy/200x200png.png\\",\\"2024-03-07T15:23:20Z\\"\\n\\"xxx\\",\\"photos_and_videos/CoverPhotos_yyyyyy/200x200png.png\\",\\"2024-07-01T07:46:40Z\\"\\n",
"[\\n \\"from\\",\\n \\"to\\",\\n \\"timestamp\\",\\n \\"body\\"\\n]\\n\\"Me\\",\\"xxx\\",\\"2024-01-13T07:13:20Z\\",\\"xxx\\"\\n\\"Me\\",\\"xxx\\",\\"2024-01-13T07:13:20Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"action\\",\\"ip\\",\\"user_agent\\",\\"datr_cookie\\",\\"city\\",\\"region\\",\\"country\\",\\"site_name\\",\\"timestamp\\"\\n\\"xxx\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n",
"\\"status\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"2024-02-13T14:36:40Z\\"\\n",
"\\"service_name\\",\\"native_app_id\\",\\"username\\",\\"email\\",\\"phone_number\\",\\"name\\"\\n\\"xxx\\",69,\\"xxx\\",\\"not_a_real_email@example.com\\",\\"xxx\\",\\"xxx\\"\\n\\"xxx\\",1707005000,\\"xxx\\",\\"not_a_real_email@example.com\\",,\\"xxx\\"\\n",
"\\"event\\",\\"created_timestamp\\",\\"ip_address\\",\\"user_agent\\",\\"datr_cookie\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\",,,\\n\\"xxx\\",\\"2024-02-13T14:36:40Z\\",,,\\n",
"\\"name\\",\\"added_timestamp\\"\\n\\"xxx\\",\\"2024-12-29T08:13:20Z\\"\\n\\"xxx\\",\\"2024-09-02T12:26:40Z\\"\\n",
"\\"name\\",\\"created_timestamp\\",\\"updated_timestamp\\",\\"ip_address\\",\\"user_agent\\",\\"location\\",\\"app\\",\\"session_type\\",\\"datr_cookie\\"\\n\\"xxx\\",\\"2024-08-22T01:26:40Z\\",\\"2024-05-11T15:06:40Z\\",\\"1.1.1.1\\",\\"some/path\\",\\"\\",\\"\\",\\"\\",\\"xxx\\"\\n",
"\\"timestamp\\",\\"data\\",\\"title\\"\\n\\"2024-02-08T19:20:00Z\\",\\"TODO\\",\\"xxx\\"\\n\\"2024-01-17T14:00:00Z\\",\\"TODO\\",\\"xxx\\"\\n",
"\\"timestamp\\",\\"email\\",\\"contact_type\\"\\n\\"2024-10-18T07:03:20Z\\",\\"not_a_real_email@example.com\\",69\\n\\"2024-01-21T22:10:00Z\\",\\"not_a_real_email@example.com\\",69\\n",
"\\"name\\"\\n\\"xxx\\"\\n\\"xxx\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-02-13T13:13:20Z\\"\\n\\"xxx\\",\\"2024-10-31T00:36:40Z\\"\\n",
"\\"game\\",\\"added_timestamp\\"\\n\\"xxx\\",\\"2024-11-03T16:06:40Z\\"\\n",
"\\"title\\",\\"price\\",\\"seller\\",\\"created_timestamp\\",\\"latitude\\",\\"longitude\\",\\"description\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-12-18T05:33:20Z\\",69,69,\\"xxx\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-12-18T05:33:20Z\\",69,69,\\"xxx\\"\\n",
"\\"action\\",\\"timestamp\\",\\"site\\",\\"ip_address\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\",\\"xxx\\",\\"1.1.1.1\\"\\n\\"xxx\\",\\"2024-04-23T17:56:40Z\\",\\"xxx\\",\\"1.1.1.1\\"\\n",
"\\"timestamp\\",\\"unread\\",\\"href\\",\\"text\\"\\n\\"2024-04-30T08:16:40Z\\",true,\\"url://somewhere\\",\\"xxx\\"\\n\\"2024-04-30T08:16:40Z\\",true,\\"url://somewhere\\",\\"xxx\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"2024-05-01T07:53:20Z\\"\\n",
"\\"from\\",\\"to\\",\\"amount\\",\\"currency\\",\\"type\\",\\"status\\",\\"payment_method\\",\\"created_timestamp\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-05-05T21:36:40Z\\"\\n",
"\\"name\\",\\"uri\\",\\"timestamp\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-01-15T12:00:00Z\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-01-12T06:13:20Z\\"\\n",
"\\"from\\",\\"to\\",\\"rank\\",\\"timestamp\\"\\n\\"xxx\\",\\"xxx\\",69,\\"2024-07-22T19:03:20Z\\"\\n",
"\\"title\\",\\"timestamp\\",\\"reaction\\"\\n,\\"2024-01-14T06:50:00Z\\",\\"xxx\\"\\n,\\"2024-01-14T06:50:00Z\\",\\"xxx\\"\\n",
"\\"title\\",\\"timestamp\\"\\n,\\"2024-10-06T08:56:40Z\\"\\n,\\"2024-10-06T08:56:40Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-02-08T16:33:20Z\\"\\n\\"xxx\\",\\"2024-09-24T19:10:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-09-27T15:13:20Z\\"\\n\\"xxx\\",\\"2024-08-24T00:40:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-06-23T05:20:00Z\\"\\n\\"xxx\\",\\"2024-05-25T08:16:40Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n\\"xxx\\",\\"2024-04-28T20:10:00Z\\"\\n",
"\\"from\\",\\"to\\",\\"subject\\",\\"message\\",\\"timestamp\\"\\n\\"not_a_real_email@example.com\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-10-16T06:26:40Z\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"url://somewhere\\",\\"2024-10-16T06:26:40Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-12-17T08:43:20Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n\\"xxx\\",\\"2024-01-14T06:50:00Z\\"\\n",
"\\"name\\",\\"id\\",\\"type\\",\\"timestamp\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-11T12:36:40Z\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-10T19:56:40Z\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-10T11:36:40Z\\"\\n\\"xxx\\",69,\\"xxx\\",\\"2024-02-07T21:06:40Z\\"\\n",
"\\"name\\",\\"uri\\",\\"timestamp\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-02-27T05:00:00Z\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-05-16T03:26:40Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"TODO: data\\",\\"2024-05-01T07:53:20Z\\"\\n\\"xxx\\",\\"TODO: data\\",\\"2024-10-31T06:10:00Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"TODO\\",\\"2024-02-08T19:20:00Z\\"\\n\\"xxx\\",\\"TODO\\",\\"2024-02-08T19:20:00Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"xxx\\",\\"2024-11-17T06:30:00Z\\"\\n\\"xxx\\",\\"xxx\\",\\"2024-11-17T06:30:00Z\\"\\n"
]
`;
exports[`facebook: Can load the 2025 export 1`] = `
[
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"some/path\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\"xxx\\"\\n",
"\\"from\\",\\"to\\",\\"timestamp\\",\\"content\\"\\n\\"xxx\\",\\"<other>\\",\\"1970-01-01T00:00:00Z\\",\\n",
"\\"action\\",\\"ip\\",\\"user_agent\\",\\"datr_cookie\\",\\"city\\",\\"region\\",\\"country\\",\\"site_name\\",\\"timestamp\\"\\n\\"xxx\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-11-22T10:06:40Z\\"\\n\\"xxx\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-11-21T23:00:00Z\\"\\n",
"\\"timestamp\\",\\"data\\",\\"title\\"\\n\\"2024-02-13T02:06:40Z\\",\\"TODO\\",\\"xxx\\"\\n\\"2024-07-12T02:06:40Z\\",\\"TODO\\",\\"xxx\\"\\n",
"\\"name\\",\\"added_timestamp\\"\\n\\"xxx\\",\\"2024-01-12T00:40:00Z\\"\\n\\"xxx\\",\\"2024-06-21T17:13:20Z\\"\\n",
"\\"timestamp\\",\\"email\\",\\"contact_type\\"\\n\\"2024-02-07T19:43:20Z\\",\\"not_a_real_email@example.com\\",69\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"TODO\\",\\"2024-10-06T06:10:00Z\\"\\n\\"xxx\\",\\"TODO\\",\\"2024-01-22T16:13:20Z\\"\\n",
"\\"title\\",\\"price\\",\\"seller\\",\\"created_timestamp\\",\\"latitude\\",\\"longitude\\",\\"description\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-10-02T23:00:00Z\\",69,69,\\"xxx\\"\\n\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"2024-09-27T01:20:00Z\\",69,69,\\"xxx\\"\\n",
"\\"action\\",\\"timestamp\\",\\"site\\",\\"ip_address\\"\\n\\"xxx\\",\\"2024-08-10T14:26:40Z\\",\\"xxx\\",\\"1.1.1.1\\"\\n\\"xxx\\",\\"2024-08-10T14:26:40Z\\",\\"xxx\\",\\"1.1.1.1\\"\\n",
"\\"timestamp\\",\\"unread\\",\\"href\\",\\"text\\"\\n\\"2024-11-20T12:16:40Z\\",true,\\"url://somewhere\\",\\"xxx\\"\\n\\"2024-11-15T00:20:00Z\\",true,\\"url://somewhere\\",\\"xxx\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-02-21T03:10:00Z\\"\\n",
"\\"name\\",\\"uri\\",\\"timestamp\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-09-11T20:03:20Z\\"\\n\\"xxx\\",\\"url://somewhere\\",\\"2024-01-20T12:50:00Z\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-09-10T10:43:20Z\\"\\n\\"xxx\\",\\"2024-09-02T12:26:40Z\\"\\n",
"\\"event\\",\\"created_timestamp\\",\\"ip_address\\",\\"user_agent\\",\\"datr_cookie\\"\\n\\"xxx\\",\\"2024-08-11T01:33:20Z\\",,,\\n\\"xxx\\",\\"2024-08-10T14:26:40Z\\",,,\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-09-01T14:13:20Z\\"\\n\\"xxx\\",\\"2024-08-12T08:06:40Z\\"\\n",
"\\"start\\",\\"end\\"\\n",
"\\"name\\",\\"created_timestamp\\",\\"updated_timestamp\\",\\"ip_address\\",\\"user_agent\\",\\"location\\",\\"app\\",\\"session_type\\",\\"datr_cookie\\"\\n,\\"2024-04-04T19:46:40Z\\",\\"2024-11-23T02:46:40Z\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\"\\n,\\"2024-04-05T06:53:20Z\\",\\"2024-11-22T10:06:40Z\\",\\"1.1.1.1\\",\\"some/path\\",\\"xxx\\",\\"xxx\\",\\"xxx\\",\\"xxx\\"\\n",
"\\"name\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-04-01T16:46:40Z\\"\\n\\"xxx\\",\\"2024-09-07T16:03:20Z\\"\\n",
"\\"title\\",\\"timestamp\\"\\n\\"xxx\\",\\"2024-02-12T17:46:40Z\\"\\n\\"xxx\\",\\"2024-02-12T17:46:40Z\\"\\n",
"\\"title\\",\\"data\\",\\"timestamp\\"\\n\\"xxx\\",\\"xxx\\",\\"2024-12-08T09:26:40Z\\"\\n\\"xxx\\",\\"xxx\\",\\"2024-12-28T00:16:40Z\\"\\n"
]
`;


@ -11,3 +11,6 @@
* `facebook-json-2021-05-01` - Facebook JSON export
* `facebook-json-2025-11-29` - Facebook JSON export
* [`discord-chat-exporter-2026-02`](./discord-chat-exporter-2026-02.md) - Discord export with [DiscordChatExporter](https://github.com/Tyrrrz/DiscordChatExporter) sometime around Feb 2026
* [`discord-json-2021-01`](./discord-json-2021-01.md) - Discord JSON export
* [`snapchat-2023-11`](./snapchat-2023-11.md) - Snapchat JSON + HTML export


@ -0,0 +1,25 @@
# discord-chat-exporter-2026-02
An export from `DiscordChatExporter`, a comprehensive third-party Discord export tool
## Export methodology
This uses the version of `DiscordChatExporter` that was at the top of the releases tab on GitHub around February 2026. **TODO: figure out version**
This export used a command something like the following, to try to get _everything_: `dotnet DiscordChatExporter.Cli.dll export -t xxx -o ~/DiscordChatExporter -f json --media --reuse-media --include-threads -c xxx`
* It uses the `export` command with `-c`, but the same applies to `exportguild` with `-g`
* `-f json` so only the JSON export
* `--media` downloads all media
* `--reuse-media` not quite sure what this does, because media still ends up in a folder per channel...
* `--include-threads` to get any threads
## Manual edits
* Lots of image replacing + placeholders
* Had to rename the folders
## Notes
The export format has files and folders with similar, information-dense names. I tried to preserve that, as it's the only way to correlate a media folder with its JSON file
* No EXIF on any media files
* There are embeds and thumbnails in the example chat messages, but I have no other specimens
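Judging from the fixture paths, the correlation is just a naming convention: the media folder is the export's JSON file name plus a `_Files` suffix. A minimal sketch, using a hypothetical helper that is not part of this repo:

```typescript
// DiscordChatExporter appears to place a channel's downloaded media in a
// sibling folder named after the JSON export itself, with a "_Files" suffix
// (as seen in avatarUrl/attachment paths in the fixture).
// Hypothetical helper, not part of this repo.
function mediaFolderFor(jsonFileName: string): string {
  return `${jsonFileName}_Files`;
}

mediaFolderFor("GuildName - Text Channels - ChannelName [0000000000000000].json");
// → "GuildName - Text Channels - ChannelName [0000000000000000].json_Files"
```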


@ -0,0 +1,145 @@
{
"guild": {
"id": "111111111111111111",
"name": "xxxxxxxx",
"iconUrl": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/avatar.png"
},
"channel": {
"id": "111111111111111111",
"type": "xxxxxxxxxxxxx",
"categoryId": "111111111111111111",
"category": "xxxxxxxxxxxxx",
"name": "xxxxxxx",
"topic": null
},
"dateRange": {
"after": null,
"before": null
},
"exportedAt": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"messages": [
{
"id": "111111111111111111",
"type": "xxxxxxxxxxxxxxx",
"timestamp": "2020-04-13T10:09:08.000000+00:00",
"timestampEdited": null,
"callEndedTimestamp": null,
"isPinned": false,
"content": "xxxxxxxxxxxxxxxxxx",
"author": {
"id": "111111111111111111",
"name": "xxxxxxxx",
"discriminator": "1111",
"nickname": "xxxxxxxx",
"color": null,
"isBot": false,
"roles": [],
"avatarUrl": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/avatar.png"
},
"attachments": [],
"embeds": [],
"stickers": [],
"reactions": [],
"mentions": [],
"inlineEmojis": []
},
{
"id": "111111111111111111",
"type": "xxxxxxx",
"timestamp": "2020-04-13T10:09:08.000000+00:00",
"timestampEdited": null,
"callEndedTimestamp": null,
"isPinned": false,
"content": "xxxxxxxxx",
"author": {
"id": "111111111111111111",
"name": "xxxxxxxx",
"discriminator": "1111",
"nickname": "xxxxxxxx",
"color": null,
"isBot": false,
"roles": [],
"avatarUrl": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/avatar.png"
},
"attachments": [],
"embeds": [],
"stickers": [],
"reactions": [],
"mentions": [],
"inlineEmojis": []
},
{
"id": "111111111111111111",
"type": "xxxxxxx",
"timestamp": "2020-04-13T10:09:08.000000+00:00",
"timestampEdited": null,
"callEndedTimestamp": null,
"isPinned": false,
"content": "https://example.com/example.png",
"author": {
"id": "111111111111111111",
"name": "xxxxxxxx",
"discriminator": "1111",
"nickname": "xxxxxxxx",
"color": null,
"isBot": false,
"roles": [],
"avatarUrl": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/avatar.png"
},
"attachments": [],
"embeds": [
{
"title": "",
"url": "https://example.com/example.png",
"timestamp": null,
"description": "",
"thumbnail": {
"url": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/example.png",
"width": 111,
"height": 111
},
"images": [],
"fields": [],
"inlineEmojis": []
}
],
"stickers": [],
"reactions": [],
"mentions": [],
"inlineEmojis": []
},
{
"id": "111111111111111111",
"type": "xxxxxxx",
"timestamp": "2020-04-13T10:09:08.000000+00:00",
"timestampEdited": null,
"callEndedTimestamp": null,
"isPinned": false,
"content": "xxx",
"author": {
"id": "111111111111111111",
"name": "xxxxxxxx",
"discriminator": "1111",
"nickname": "xxxxxxxx",
"color": null,
"isBot": false,
"roles": [],
"avatarUrl": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/avatar.png"
},
"attachments": [
{
"id": "111111111111111111",
"url": "GuildName - Text Channels - ChannelName [0000000000000000].json_Files/unknown-SUFFIX.png",
"fileName": "unknown.png",
"fileSizeBytes": 111111
}
],
"embeds": [],
"stickers": [],
"reactions": [],
"mentions": [],
"inlineEmojis": []
}
],
"messageCount": 111
}

Binary file not shown (1.2 KiB).

Binary file not shown (1.3 KiB).

test/fixtures/discord-json-2021-01.md vendored Normal file

@ -0,0 +1,41 @@
# discord-json-2021-01
## Manual edits
* images -> placeholders
  * `accounts/avatar.png`
* manually scrubbed folder names
  * `account/applications/0000000000000`
## Notes about files
* `activity/`
  * All the .json files are NDJSON, so some JSON tools don't like them
  * _Massive_ files. They hang scrub.ts for a long, long time (had to run these piecemeal)
  * These files also have an _incredible_ amount of shapes and variance
  * Instead of outputting all the shapes, I made a sort of "super-object" to capture the shape with `jq -n '[inputs] | add' events-2021-00000-of-00001.json.tmp > unique_shape.json` and then scrubbed `unique_shape.json`
* `messages/`
  * I did these by hand to keep all the ids the same
  * There are multiple types of chats: DMs, guild channels, etc.
  * I did the CSVs by hand, as I have no scrubber for those
  * These are only **THE EXPORTING USER'S MESSAGES**, no other users, just FYI
  * Ids in `messages.csv` are just the id of the message, not of any user
  * There is the potential to derive missing info about a channel via `@` tags sent, or possibly via attachments. Maybe...
  * `11111111111111111`
    * This one has a shorter id (it's an older one)
    * Has `type: 0` but there's no guild information in `channel.json`
    * The user name was `null` in `index.json`
    * It's a really odd one
  * `222222222222222222`
    * This was a DM channel (said `direct message with xxx#7777` in index.json)
    * Has `type: 1` and there are two recipients (just the ids) in `channel.json`
    * Unfortunately that's all the info in the export
  * `333333333333333333`
    * This was a normal guild channel
    * `type: 0` and there's guild information in `channel.json`
    * I kept a good set of messages around from this one to show how attachments and other stuff work
    * The last message seemed to be a link, not an attachment. Links just seem to be normal text
* `programs/`
  * Was empty...
* `servers/`
  * Info about _some_ of the guilds we have ids for
  * guild.json didn't really contain anything except the name
  * I kept around the only guild where I noticed an audit-log.json with info in it
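The NDJSON quirk noted for `activity/` just means one JSON value per line, so a whole-file `JSON.parse` fails; splitting on newlines first works. A minimal sketch, using a hypothetical helper that is not part of this repo's scrub tooling:

```typescript
// NDJSON: one JSON object per line. JSON.parse on the full file throws,
// but each line parses independently.
// Hypothetical helper, not part of this repo.
function parseNDJSON(text: string): unknown[] {
  return text
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line));
}

const events = parseNDJSON('{"event":"a"}\n{"event":"b"}\n');
console.log(events.length); // 2
```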


@ -0,0 +1,26 @@
__ __ ___ _ _ ___ ___ ___ _____ ___ _
\ \ / / / _ \ | | | | | _ \ o O O | \ / \ |_ _| / \ | |
\ V / | (_) | | |_| | | / o | |) | | - | | | | - | |_|
_|_|_ \___/ \___/ |_|_\ TS__[O] |___/ |_|_| _|_|_ |_|_| _(_)_
_| """ |_|"""""|_|"""""|_|"""""| <======|_|"""""|_|"""""|_|"""""|_|"""""|_| """ |
"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'./o--000'"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'
___ ___ _ _ ___ ___ ___ _ _ _
|_ _| / __| o O O | || | | __| | _ \ | __| | | | | | |
| | \__ \ o | __ | | _| | / | _| |_| |_| |_|
|___| |___/ TS__[O] |_||_| |___| |_|_\ |___| _(_)_ _(_)_ _(_)_
_|"""""|_|"""""| <======|_|"""""|_|"""""|_|"""""|_|"""""|_| """ |_| """ |_| """ |
"`-0-0-'"`-0-0-'./o--000'"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'"`-0-0-'
Welcome to your Discord Data Package!
Inside, you'll find a few JSON (JavaScript Object Notation) and CSV (Comma Separated Values) files
of the data we use to provide Discord's service to you. We've chosen these formats for ease of
processing. Furthermore, the files have been organized into logical groups to make it easy to
understand and work with (at least, we hope so)!
For more information, you can view our in-depth help article at the following URL:
https://support.discord.com/hc/articles/360004957991
All the best,
Discord Team


@ -0,0 +1,16 @@
{
"id": "111111111111111111",
"name": "xxxxxxx",
"icon": null,
"description": "",
"summary": "",
"hook": false,
"verify_key": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"flags": 1,
"secret": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"redirect_uris": [],
"rpc_application_state": 1,
"store_application_state": 1,
"verification_state": 1,
"interactions_endpoint_url": null
}

Binary file not shown (1.7 KiB).

@ -0,0 +1,399 @@
{
"id": "111111111111111111",
"username": "xxxxxxxx",
"discriminator": 1111,
"email": "not_a_real_email@example.com",
"verified": false,
"avatar_hash": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"has_mobile": false,
"needs_email_verification": false,
"premium_until": "2020-04-13T10:09:08.000000+00:00",
"flags": 11111111111111,
"phone": "xxxxxxxxxxxx",
"temp_banned_until": null,
"ip": "1.1.1.1",
"settings": {
"locale": "xxxxx",
"show_current_game": false,
"restricted_guilds": [],
"default_guilds_restricted": false,
"inline_attachment_media": false,
"inline_embed_media": false,
"gif_auto_play": false,
"render_embeds": false,
"render_reactions": false,
"animate_emoji": false,
"enable_tts_command": false,
"message_display_compact": false,
"convert_emoticons": false,
"explicit_content_filter": 1,
"disable_games_tab": false,
"theme": "xxxx",
"developer_mode": false,
"guild_positions": [
"111111111111111111",
"111111111111111111"
],
"detect_platform_accounts": false,
"status": "xxxxxx",
"afk_timeout": 111,
"timezone_offset": 111,
"stream_notifications_enabled": false,
"allow_accessibility_detection": false,
"contact_sync_enabled": false,
"native_phone_integration_enabled": false,
"animate_stickers": 1,
"friend_source_flags": {
"all": false
},
"guild_folders": [
{
"guild_ids": [
"111111111111111111"
],
"id": null,
"name": null,
"color": null
},
{
"guild_ids": [
"111111111111111111"
],
"id": null,
"name": null,
"color": null
}
],
"custom_status": null
},
"connections": [
{
"type": "xxxxxxxxx",
"id": "xxxxxxxxxxx",
"name": "xxxxxxxxxxx",
"revoked": false,
"visibility": 1,
"friend_sync": false,
"show_activity": false,
"verified": false
},
{
"type": "xxxxxxx",
"id": "xxxxxxxx",
"name": "xxxxxxxx",
"revoked": false,
"visibility": 1,
"friend_sync": false,
"show_activity": false,
"verified": false
}
],
"external_friends_lists": [
{
"user_id": "111111111111111111",
"platform_type": "xxxxx",
"name": "xxxxxxxx",
"id_hash": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"friend_id_hashes": [
"a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"
]
},
{
"user_id": "111111111111111111",
"platform_type": "xxxxxxxxx",
"name": "xxxxxxxxxxx",
"id_hash": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"friend_id_hashes": [
"a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"
]
}
],
"friend_suggestions": [],
"mfa_sessions": [],
"relationships": [
{
"id": "11111111111111111",
"type": 1,
"nickname": null,
"user": {
"id": "11111111111111111",
"username": "xxxxxxxxxxxx",
"avatar": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"discriminator": "1111",
"public_flags": 1
}
},
{
"id": "11111111111111111",
"type": 1,
"nickname": null,
"user": {
"id": "11111111111111111",
"username": "xxxx",
"avatar": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"discriminator": "1111",
"public_flags": 111
}
}
],
"payments": [
{
"id": "111111111111111111",
"created_at": "2020-04-13T10:09:08.000000+00:00",
"currency": "xxx",
"tax": 111,
"tax_inclusive": false,
"amount": 1111,
"amount_refunded": 1,
"status": 1,
"description": "xxxxxxxxxxxxxxxxxxxx",
"flags": 1,
"subscription": {
"id": "111111111111111111",
"type": 1,
"current_period_start": "2020-04-13T10:09:08.000000+00:00",
"current_period_end": "2020-04-13T10:09:08.000000+00:00",
"payment_gateway": null,
"payment_gateway_plan_id": "xxxxxxxxxxxxxxxxxxx",
"currency": "xxx",
"plan_id": "111111111111111111",
"items": [
{
"id": "111111111111111111",
"plan_id": "111111111111111111",
"quantity": 1
}
]
},
"payment_source": {
"id": "111111111111111111",
"type": 1,
"invalid": false,
"brand": "xxxx",
"last_4": "1111",
"expires_month": 11,
"expires_year": 1111,
"billing_address": {
"name": "xxxxxxxxxxxxx",
"line_1": "xxxxxxxxxxxxxxxxx",
"line_2": null,
"city": "xxxxxxxx",
"state": "xx",
"country": "xx",
"postal_code": "11111"
},
"country": "xx"
},
"sku_id": "111111111111111111",
"sku_price": 1111,
"sku_subscription_plan_id": "111111111111111111"
},
{
"id": "111111111111111111",
"created_at": "2020-04-13T10:09:08.000000+00:00",
"currency": "xxx",
"tax": 111,
"tax_inclusive": false,
"amount": 1111,
"amount_refunded": 1,
"status": 1,
"description": "xxxxxxxxxxxxxxxxxxxx",
"flags": 1,
"subscription": {
"id": "111111111111111111",
"type": 1,
"current_period_start": "2020-04-13T10:09:08.000000+00:00",
"current_period_end": "2020-04-13T10:09:08.000000+00:00",
"payment_gateway": null,
"payment_gateway_plan_id": "xxxxxxxxxxxxxxxxxxx",
"currency": "xxx",
"plan_id": "111111111111111111",
"items": [
{
"id": "111111111111111111",
"plan_id": "111111111111111111",
"quantity": 1
}
]
},
"payment_source": {
"id": "111111111111111111",
"type": 1,
"invalid": false,
"brand": "xxxx",
"last_4": "1111",
"expires_month": 11,
"expires_year": 1111,
"billing_address": {
"name": "xxxxxxxxxxxxx",
"line_1": "xxxxxxxxxxxxxxxxxx",
"line_2": null,
"city": "xxxxxxxxxx",
"state": "xx",
"country": "xx",
"postal_code": "11111"
},
"country": "xx"
},
"sku_id": "111111111111111111",
"sku_price": 1111,
"sku_subscription_plan_id": "111111111111111111"
}
],
"payment_sources": [
{
"id": "111111111111111111",
"type": 1,
"invalid": false,
"brand": "xxxx",
"last_4": "1111",
"expires_month": 11,
"expires_year": 1111,
"billing_address": {
"name": "xxxxxxxxxxxxx",
"line_1": "xxxxxxxxxxxxxxxxx",
"line_2": null,
"city": "xxxxxxxx",
"state": "xx",
"country": "xx",
"postal_code": "11111"
},
"country": "xx"
}
],
"guild_settings": [
{
"guild_id": null,
"suppress_everyone": false,
"suppress_roles": false,
"message_notifications": 1,
"mobile_push": false,
"muted": false,
"mute_config": null,
"channel_overrides": [
{
"channel_id": "111111111111111111",
"message_notifications": 1,
"muted": false,
"mute_config": null
}
],
"version": 11
},
{
"guild_id": "11111111111111111",
"suppress_everyone": false,
"suppress_roles": false,
"message_notifications": 1,
"mobile_push": false,
"muted": false,
"mute_config": null,
"channel_overrides": [
{
"channel_id": "111111111111111111",
"message_notifications": 1,
"muted": false,
"mute_config": null
},
{
"channel_id": "111111111111111111",
"message_notifications": 1,
"muted": false,
"mute_config": null
}
],
"version": 1
}
],
"library_applications": [
{
"application": {
"id": "111111111111111111",
"name": "xxxxxxxxxxxx",
"icon": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"description": "xxxxxxxxxxxxxxxxxxxxx",
"summary": "xxxxxxxxxxxxxxxxxxxxx",
"primary_sku_id": "111111111111111111",
"hook": false,
"slug": "xxxxxxxxxxxx",
"guild_id": "111111111111111111",
"verify_key": "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1",
"publishers": [
{
"id": "111111111111111111",
"name": "xxxxxxxxxxx"
}
],
"developers": [
{
"id": "111111111111111111",
"name": "xxxxxxxxxxx"
},
{
"id": "111111111111111111",
"name": "xxxxxxxxxxxxxxxxxxxxxxxx"
}
]
},
"branch_id": "111111111111111111",
"sku_id": "111111111111111111",
"sku": {
"id": "111111111111111111",
"type": 1,
"premium": false,
"preorder_release_at": null,
"preorder_approximate_release_date": null
},
"flags": 1,
"created_at": "2020-04-13T10:09:08.000000+00:00",
"entitlements": [
{
"id": "111111111111111111",
"sku_id": "111111111111111111",
"application_id": "111111111111111111",
"user_id": "111111111111111111",
"type": 1,
"deleted": false,
"gift_code_flags": 1,
"branches": [
"111111111111111111"
]
}
]
}
],
"entitlements": [
{
"id": "111111111111111111",
"sku_id": "111111111111111111",
"application_id": "111111111111111111",
"user_id": "111111111111111111",
"type": 1,
"deleted": false,
"gift_code_flags": 1,
"branches": [
"111111111111111111"
],
"sku_name": "xxxxxxxxxxxx"
}
],
"user_activity_application_statistics": [
{
"application_id": "111111111111111111",
"last_played_at": "2020-04-13T10:09:08.000000+00:00",
"total_duration": 1111,
"total_discord_sku_duration": 1
},
{
"application_id": "111111111111111111",
"last_played_at": "2020-04-13T10:09:08.000000+00:00",
"total_duration": 111111,
"total_discord_sku_duration": 1
}
],
"notes": {
"111111111111111111": "xxxx"
}
}

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long


@ -0,0 +1 @@
{"id": "11111111111111111", "type": 0}


@ -0,0 +1,2 @@
ID,Timestamp,Contents,Attachments
8888888888,2022-02-22 22:22:22.222222+00:00,Heyo,

@ -0,0 +1 @@
{"id": "222222222222222222", "type": 1, "recipients": ["00000000000000000", "1111111111111111"]}


@ -0,0 +1,2 @@
ID,Timestamp,Contents,Attachments
2222222222222,2022-22-22 22:22:22.22222+00:00,Heyo,

@ -0,0 +1 @@
{"id": "333333333333333333", "type": 0, "name": "generalchat", "guild": {"id": "333333333333333332", "name": "xxx"}}


@ -0,0 +1,6 @@
ID,Timestamp,Contents,Attachments
000000000000000005,2011-02-02 02:05:02.000000+00:00,Huh what the heck is this message,
000000000000000004,2011-02-02 02:04:02.000000+00:00,<:thonk:000000000000000000><:thonk:000000000000000000><:thonk:000000000000000000>,
000000000000000003,2011-02-02 02:03:02.000000+00:00,"(so <@00000000000000000> who are you)",
000000000000000002,2011-02-02 02:02:02.000000+00:00,,https://cdn.discordapp.com/attachments/000000000000000000/000000000000000000/image.png
000000000000000001,2011-02-02 02:01:02.000000+00:00,https://google.com/whatever,

@ -0,0 +1,5 @@
{
"11111111111111111": null,
"222222222222222222": "Direct Message with xxx#7777",
"333333333333333333": "generalchat"
}


@ -0,0 +1,18 @@
[
{
"id": "111111111111111111",
"user_id": "111111111111111111",
"action_type": 11,
"changes": [
{
"key": "xxxx",
"new_value": [
{
"name": "xxxxxxxxxx",
"id": "111111111111111111"
}
]
}
]
}
]


@ -0,0 +1,4 @@
{
"id": "444444444444444444",
"name": "xxx"
}


@ -0,0 +1,3 @@
{
"444444444444444444": "xxx"
}

test/fixtures/facebook-json.md vendored Normal file

@ -0,0 +1,9 @@
# facebook-json exports
## `facebook-json-2021-05-01`
* Manual edits of images -> placeholders, folder names, key names (in support cases specifically)
* This was one of the first few datasets I scrubbed so a lot of manual work was done. Should be easier now
* I went poking around this one and there was no EXIF on any of the images I looked at; EXIF appeared only in the JSON
## `facebook-json-2025-11-29`
* Manual edits of images -> placeholders, folder names, key names
* This was one of the first few datasets I scrubbed so a lot of manual work was done. Should be easier now

test/fixtures/fitbit-2026-02.md vendored Normal file

@ -0,0 +1,31 @@
# fitbit-2026-02
## Manual edits / notes
* Many of these files are `category-2020-04-13.json` or `.csv`, and there are something like 100 of them. I had to manually delete all the extras and just keep one or two around. I'm not keeping two for all of them because it's too much manual editing right now
* `Social`
* `badge.json` kept some of the type names
* `Sleep`
* `sleep_score.csv` Manually kept the magnitude of the last index, as the scrubber was not good at preserving that
* `sleep-xxx.json` Pretty sure there are multiple shapes on this based on the type. In the UI it presents them differently (one showing just asleep/not asleep, the other showing like rem and such)
* **TODO** - I'm pretty sure I only have one of the types in the fixture's export data. Need to add the other one
* `Physical Activity`
* `time_in_heart_rate_zones-xxx.json` WHY DOES THIS USE MM/DD/YY DATES, ughhhhh
* `swim_lengths_data-xxx.json` I don't really swim, so I don't know why there's so much data in here
* `sedentary_minutes-xxx.json` Hmm, this one has a lot of 1440, even though I did not wear my fitbit for a lot of those days or it failed to sync, so probably just defaults... I see that in a lot of this data
* `resting_heart_rate-xxx.json` Yeah... This one has a bunch of null objects in it. So if you want to parse any of this data, you're going to have to manually filter out all the days you weren't wearing your device
* I also added an extra entry to this file so the nulls show up in the export
* `exercise-100.json` These are weird; the suffix seems to be `-\d+`. No specific pattern either, I only see 0 and 100 in here
* `distance-xxx.json` This one seems to be minute-by-minute, but only for some minutes, which is odd. I don't know if these are supposed to be the number since the last entry, or the number since the last minute (but the last minute was missed), or what...
* `calories-xxx.json` Why does the default value here seem to be 0.95 for everything, ugh
* `Active Zone Minutes - xxx.csv` Added the types back to this
* UGH this uses `2020-04-13T10:10` for the times, wtf why
* `Personal & Account`
* `weight-xxx.json` this uses `MM/DD/YY` and `HH:MM:ss` separately. Manually had to fix
* `Heart`
* `afib_ppg_enrollment.csv` Another off the wall date format `Fri Dec 19 06:32:30 UTC 2025`. Going to manually edit this
* `Biometrics`
* `Glucose xxx.csv` no data in all of these. They're not even proper CSVs when there's no data... ugh. The only info is the year and month inside the filename
* `Google Data`
* There's so much overlap here with the other stuff, but some of it looks better handled, others don't
* `Physical Activity/daily_heart_rate_zones.csv` - Ugh, this file embeds JSON inside a CSV. Why? They did not need to do this...
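A minimal TypeScript sketch of how the oddball formats noted above might be normalized. The helper names and shapes here are hypothetical (not part of this repo's scrub/export code), and the 20xx-century assumption is mine:

```typescript
// Hypothetical helpers for the date quirks seen in this export.

// "MM/DD/YY" (as in time_in_heart_rate_zones / weight) -> "YYYY-MM-DD".
// Assumes every two-digit year means 20xx.
function fromMonthDayYear(s: string): string {
  const [mm, dd, yy] = s.split("/");
  return `20${yy.padStart(2, "0")}-${mm.padStart(2, "0")}-${dd.padStart(2, "0")}`;
}

// "Fri Dec 19 06:32:30 UTC 2025" (as in afib_ppg_enrollment.csv) -> ISO-style UTC.
function fromCtimeUtc(s: string): string {
  const months: Record<string, string> = {
    Jan: "01", Feb: "02", Mar: "03", Apr: "04", May: "05", Jun: "06",
    Jul: "07", Aug: "08", Sep: "09", Oct: "10", Nov: "11", Dec: "12",
  };
  // Fields: day-of-week, month, day, time, zone, year.
  const [, mon, day, time, , year] = s.split(" ");
  return `${year}-${months[mon]}-${day.padStart(2, "0")}T${time}Z`;
}

// resting_heart_rate entries carry a null value object for unworn days;
// drop those before doing any aggregation.
function dropNullValues<T extends { value: unknown }>(entries: T[]): T[] {
  return entries.filter((e) => e.value !== null);
}
```
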


@ -0,0 +1,10 @@
timestamp,event_name,email,location,ip,outcome,reason,application,device_info
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,xxxxxxxxxxxxxx,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,,xxxxxxxxxxxxxxxxxxxxxx,
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,xxxxxxxxxxxxxx,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,xxxxxxxxxxxxxx,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,xxxxxxxxxxxxxxxxxxxxxxx,1.1.1.1,xxxxxxx,xxxxxxxxxxxxxx,xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

@ -0,0 +1,4 @@
timestamp,event_name,email,location,ip,outcome,reason
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,,1.1.1.1,xxxxxxx,
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,,1.1.1.1,xxxxxxx,
Mon Apr 13 10:09:08 UTC 2020,xxxxxxxxxxxxxxxxxxx,not_a_real_email@example.com,,1.1.1.1,xxxxxxx,

@ -0,0 +1,42 @@
Coach Data Export
The Coach category of your data export includes the pieces of content that you favorited in the Coach view, as well as
content recommendations that were generated for you based on watch history, wellbeing and physical activity.
Files included:
----------
Coach Favorites.csv
This includes the items that you marked as favorite in the Coach view.
timestamp - Datetime of the moment the item was favorited
id - Unique identifier of the item
title - Name of the item
bundle_id - Category of content in the Coach view that the item belongs to
content_type - Content type of the item
----------
Coach Content Recommendations.csv
This is the list of videos/audios that were recently recommended to you based on your and other users' viewing history.
date - Date when the recommendation was generated
id - Unique identifier of the item
title - Name of the item
bundle_id - Category of content in the Coach view that the item belongs to
content_type - Content type of the item
rating - Rating representing how good the recommendation was computed to be
----------
Coach Dynamic Recommendations.csv
This file contains the list of content rows that were personalized for you and that were embedded in pages in the app
other than the main Coach view.
timestamp - Datetime of when the content row was determined
component_id - Unique identifier of the content row
bundle_id - Category of content in the Coach view that the item belongs to
id - Unique identifier of the item
title - Name of the item
content_type - Content type of the item
associated_tags - Tags describing the items in the content row and which can be filtered upon


@ -0,0 +1,2 @@
previous_email,change_time,request_id
not_a_real_email@example.com,2020-04-13T10:09:08.000000Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

@ -0,0 +1,2 @@
date_changed,reason
2020-04-13T10:09:08.000000Z,some/path

@ -0,0 +1,39 @@
Biometrics Data Export
Description: For users who have access and started to use biometrics features (such as Blood Glucose) in the Fitbit app this category of the data export includes all of the content added via those features. This includes your biometrics and other associated data, including annotations, time and type of measurement, your personal ranges and reminders.
Files Included:
----------
Glucose Reminders.csv
The list of reminders that you created in the Fitbit app.
time - Time
days - Days of week
enabled - Whether this reminder is enabled
----------
Glucose Target Ranges.csv
Your blood glucose personal target range.
min - Target range (min)
max - Target range (max)
----------
Glucose YYYYMM.csv
Each file holds the list of blood glucose values and associated data for the specific month (defined by YYYY-MM).
time - Entry date and time
value - Value
unit - Unit (MMOL_L / MG_DL)
data_source - Description of data source (UNKNOWN / MANUAL / APP)
measurement_type - Entry type (UNSPECIFIED / SMBG / CGM / LAB_TEST)
medical_codes - List of medical codes (if available) (LOINC and/or SNOMED)
tags - List of associated annotations


@ -0,0 +1 @@
no data

@ -0,0 +1 @@
To access your order history and related information for orders placed between 2008 and 2024, please contact Google customer support at https://support.google.com/fitbit/gethelp.


@ -0,0 +1,15 @@
Calibration Status for Readiness and Load
The CalibrationStatusForReadinessAndLoad file contains the calibration status for the Readiness and Load features.
When the remaining number of days is 0, the user is considered calibrated.
----------
CalibrationStatusForReadinessAndLoad.csv
user_id - unique id for the user
feature - name of the feature
remaining_days - the remaining number of days required to complete the calibration
calibration_start_date - the date when calibration was started, local date
latest_completion_date - the date when calibration was completed, local date
latest_update_date - the date when the calibration status was last updated, local date


@ -0,0 +1,3 @@
feature,remaining_days,calibration_start_date,latest_completion_date,latest_update_date
xxxxxxxxxxxxxxxxxxxxxxxx,1,2020-04-13,2020-04-13,2020-04-13
xxxxxxxxxxxxxxxxxxxxxx,1,2020-04-13,2020-04-13,2020-04-13

@ -0,0 +1,14 @@
Goal Settings History
The GoalSettingsHistory file contains the settings history for the user goals.
Files Included:
----------
GoalSettingsHistory.csv
name - name of the goal
objectives - objectives of the goal containing the target value and the metric to measure
schedule - schedule of the goal, weekly or fixed datetime range
status - status of the goal, enabled or disabled
update_time - time when the goal was updated with these settings


@ -0,0 +1,21 @@
name,objectives,schedule,status,update_time,meta,title,subtitle,rationale,domain,progress_start_time,progress_end_time,progress
xxxxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,
xxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,2020-04-13 10:09:08 - 2020-04-13 10:09:08,xxxxxxx,2020-04-13 10:09:08+0000,,,,,xxxxx,,,

@ -0,0 +1,65 @@
Irregular Rhythm Notifications
The below section is dedicated to the exported data of the Irregular Rhythm
Notifications (IRN) data domain. The IRN feature analyzes your heart rhythm for
signs of atrial fibrillation (AFib).
Files Included:
----------
IrnUserState.csv
The data for the user's state with respect to the Irregular Rhythm Notifications feature.
EnrollmentState - The user's enrollment status in the IRN feature
(e.g., ENROLLED).
LastProcessedTime - The timestamp of the last time the user's heart
rhythm data was processed.
LastConclusiveWindow - The timestamp of the end of the last window of
data that was considered conclusive (either positive or negative for AFib).
LastProcessedTimestamps - A JSON array detailing the last processed
timestamp for each data source (e.g., each device).
LastNotifiedTime - The timestamp of the last time a notification was
sent to the user.
IrnAfibAlertWindows.csv
The data for individual AFib analysis windows. An alert is generated from one or more of these windows.
DeviceId - The identifier of the device that recorded the
data.
DeviceFitbitDeviceType - The model of the Fitbit device (e.g., ANTARES).
AlgorithmVersion - The version of the AFib detection algorithm used.
ServiceVersion - The version of the backend service that processed
the data.
StartTime - The start time of the analysis window.
Positive - A boolean indicating if the window was positive
for signs of AFib.
HeartBeats - A JSON array of heartbeats recorded during the
window, including the timestamp and beats per
minute for each.
IrnAfibAlerts.csv
The data for AFib alerts sent to the user.
DeviceId - The identifier of the device that recorded the
data.
DeviceFitbitDeviceType - The model of the Fitbit device.
AlgorithmVersion - The version of the AFib detection algorithm used.
ServiceVersion - The version of the backend service that processed
the data.
StartTime - The start time of the first analysis window in
the alert.
EndTime - The end time of the last analysis window in the
alert.
DetectedTime - The timestamp when the alert was officially
generated.
AlertWindows - A JSON array of the individual analysis windows
that constitute the alert. Each window includes
its start and end times, a positive flag, and the
associated heartbeats.
IsRead - A boolean indicating if the user has viewed the
notification.


@ -0,0 +1,2 @@
enrollment_state,last_processed_time,last_conclusive_window,last_processed_timestamps,last_notified_time
xxxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,

@ -0,0 +1,20 @@
UserAppSettingData Export
The UserAppSetting data file contains user preferences such as measurement units, the height and weight systems, etc.
Files included:
----------
UserAppSettingData.csv
The data for User AppSettings
preferred_workout_intensity_level - the preferred workout intensity level of the user
height_system - the height system of the user
weight_system - the weight system of the user
water_measurement_unit - the water measurement unit of the user
glucose_measurement_unit - the glucose measurement unit of the user
body_temperature_measurement_unit - the body temperature measurement unit of the user
pool_length - the pool length of the user (e.g. 16 units)
pool_length_measurement_unit - the pool length measurement unit of the user
swim_unit - the swim unit of the user
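The settings exports above (UserAppSettingData.csv and the similar files below) are long tables of (value_time, setting_name, setting_value) rows. A sketch of pivoting them into one settings object, keeping the latest value per setting name; the scrubbed fixtures contain no quoted fields, so a plain `split(',')` suffices here, whereas a real parser (e.g. csv-parse, which the tests use) belongs in production code.

```typescript
// Pivot (value_time, setting_name, setting_value) rows into a flat object,
// keeping the value with the latest value_time for each setting name.
// Lexicographic comparison of value_time works because all timestamps
// share the same "YYYY-MM-DD HH:MM:SS+0000" layout.
function pivotSettings(csv: string): Record<string, string> {
  const [header, ...rows] = csv.trim().split("\n");
  const cols = header.split(",");
  const ti = cols.indexOf("value_time");
  const ni = cols.indexOf("setting_name");
  const vi = cols.indexOf("setting_value");
  const latest: Record<string, { time: string; value: string }> = {};
  for (const row of rows) {
    const f = row.split(",");
    const prev = latest[f[ni]];
    if (!prev || f[ti] > prev.time) {
      latest[f[ni]] = { time: f[ti], value: f[vi] };
    }
  }
  return Object.fromEntries(
    Object.entries(latest).map(([name, v]) => [name, v.value]),
  );
}
```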

@ -0,0 +1,8 @@
value_time,setting_name,setting_value
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxxxxx,xxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxx,xx
2020-04-13 10:09:08+0000,xxxxxxxxxxx,11
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxxx,xxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxx,xx

@ -0,0 +1,19 @@
UserDemographicData Export
The User Demographic data file contains information commonly used for creating statistics / automatic update emails, including country, state, is_child etc.
Files included:
----------
UserDemographicData.csv
The data for User Demographic:
country - the country of the user
state - the state of the user
sex - the gender of the user
timezone - the timezone of the user
locale - the locale of the user
is_child - whether the user is a child
Note that it is expected for migrated Google accounts to contain entries with default Fitbit values for date_of_birth ("1970-01-01").
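The note above suggests a small ingestion guard: treat the Fitbit default date_of_birth as "not set" rather than a real birthday. A minimal sketch (the helper name is ours, not part of the export):

```typescript
// Migrated Google accounts may carry Fitbit's default date_of_birth
// ("1970-01-01"); treat that sentinel as unset rather than a real value.
const FITBIT_DEFAULT_DOB = "1970-01-01";

function effectiveDateOfBirth(dob: string): string | null {
  return dob === FITBIT_DEFAULT_DOB ? null : dob;
}
```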

@ -0,0 +1,4 @@
value_time,setting_name,setting_value
2020-04-13 10:09:08+0000,xxxxxx,xxxxx
2020-04-13 10:09:08+0000,xxx,xxxxxx
2020-04-13 10:09:08+0000,xxxxxxxx,some/path

@ -0,0 +1,85 @@
Exercises
The below section is dedicated to the exported data of the Exercise data domain.
Exercises are sent by the wearables device to the Backend and contain data about
the exercises performed by the user.
Files Included:
----------
UserExercises.csv
The data for exercises
exercise_id - the unique identifier of the exercise
exercise_start - the exercise start time at UTC
exercise_end - the exercise end time at UTC
utc_offset - the timezone offset relative to UTC for the exercise
exercise_created - the time when the exercise was created at UTC
exercise_last_updated - the time when the exercise was last updated at UTC
activity_name - the type of activity performed during the exercise
log_type - where the exercise was logged from (mobile, tracker, etc.)
pool_length - user's preferred pool length in the unit specified by PoolLengthUnit
pool_length_unit - pool length unit
intervals - data about the intervals of the exercise (if the exercise was an interval workout).
it is listed as blocks of the following -
- type: the type of the interval (REST or MOVE)
- interval_num: the interval number
- total_intervals: the total number of intervals in the workout
- num_repeats: the number of times the interval was repeated
- duration_millis: the interval duration in milliseconds
distance_units - the units of the distance (imperial or metric)
tracker_total_calories - the total calories burned during the exercise (registered by the tracker)
tracker_total_steps - the total steps taken during the exercise (registered by the tracker)
tracker_total_distance_mm - the total distance in millimeters covered during the exercise (registered by the tracker)
tracker_total_altitude_mm - the total altitude in millimeters covered during the exercise (registered by the tracker)
tracker_avg_heart_rate - the average heart rate during the exercise (registered by the tracker)
tracker_peak_heart_rate - the peak heart rate during the exercise (registered by the tracker)
tracker_avg_pace_mm_per_second - the average pace in millimeters per second during the exercise (registered by the tracker)
tracker_avg_speed_mm_per_second - the average speed in millimeters per second during the exercise (registered by the tracker)
tracker_peak_speed_mm_per_second - the peak speed in millimeters per second during the exercise (registered by the tracker)
tracker_auto_stride_run_mm - the stride length when running in millimeters during the exercise (registered by the tracker)
tracker_auto_stride_walk_mm - the stride length when walking in millimeters during the exercise (registered by the tracker)
tracker_swim_lengths - the number of lengths swam during a swim exercise (registered by the tracker)
tracker_pool_length - the pool length in the unit specified by TrackerPoolLengthUnit (calculated by the tracker)
tracker_pool_length_unit - the pool length unit
tracker_cardio_load - the cardio load of the exercise (registered by the tracker)
manually_logged_total_calories - total calories burned during the exercise (manually logged by the user)
manually_logged_total_steps - total steps taken during the exercise (manually logged by the user)
manually_logged_total_distance_mm - total distance in millimeters covered during the exercise (manually logged by the user)
manually_logged_pool_length - the pool length in the unit specified by ManuallyLoggedPoolLengthUnit (manually logged by the user)
manually_logged_pool_length_unit - the pool length unit
exercise_events - data about the events that happen throughout the exercise such as start, stop, pause, split
- for SPLIT, AUTO_SPLIT and INTERVAL events, all the metrics are relative to the previous event
- for PAUSE, AUTO_PAUSE and STOP events, all the metrics are relative to the start of the exercise
it is listed as blocks of the following -
- exercise_event_id: the unique identifier of the event
- timestamp: the time when the event occurred at UTC
- type: the type of the event (START, STOP, PAUSE, RESUME etc.)
- auto_cue_type: the type of the auto cue (MANUAL, DISTANCE, TIME, CALORIES etc.)
- elapsed_time_millis: the elapsed time in milliseconds
- traveled_distance_mm: the distance traveled in millimeters
- calories_burned: the calories burned
- steps: the steps taken
- average_heart_rate: average heart rate
- elevation_gain_mm: elevation gain in millimeters
- swim_lengths: number of lengths swam
- average_speed_mm_per_sec: average speed in millimeters per second
- interval_type: the type of the interval (REST or MOVE)
activity_type_probabilities - a list of activities that the user might have performed during the exercise, with the probability of each activity
autodetected_confirmed - whether the user confirmed the autodetected exercise
autodetected_start_timestamp - the start time of the autodetected exercise at UTC
autodetected_end_timestamp - the end time of the autodetected exercise at UTC
autodetected_utc_offset - the timezone offset relative to UTC for the autodetected exercise
autodetected_activity_name - the name of the autodetected activity
autodetected_sensor_based_activity_name - the name of the sensor based autodetected activity
deletion_reason - the reason why the exercise was deleted
activity_label - the label of the activity
suggested_start_timestamp - the suggested start time of the exercise at UTC
suggested_end_timestamp - the suggested end time of the exercise at UTC
reconciliation_status - the status of the reconciliation
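The tracker metrics above are millimeter-based (tracker_total_distance_mm, tracker_avg_speed_mm_per_second, and so on). Two small helpers for converting them into display units; this is plain unit arithmetic added for illustration, not part of the export itself.

```typescript
// Convert the export's millimeter-based metrics into display units.
function mmToKm(mm: number): number {
  return mm / 1_000_000; // 1 km = 1,000,000 mm
}

function mmPerSecToKmPerHour(mmPerSec: number): number {
  return (mmPerSec * 3600) / 1_000_000; // mm/s -> km/h
}
```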

@ -0,0 +1,21 @@
exercise_id,exercise_start,exercise_end,utc_offset,exercise_created,exercise_last_updated,activity_name,log_type,pool_length,pool_length_unit,intervals,distance_units,tracker_total_calories,tracker_total_steps,tracker_total_distance_mm,tracker_total_altitude_mm,tracker_avg_heart_rate,tracker_peak_heart_rate,tracker_avg_pace_mm_per_second,tracker_avg_speed_mm_per_second,tracker_peak_speed_mm_per_second,tracker_auto_stride_run_mm,tracker_auto_stride_walk_mm,tracker_swim_lengths,tracker_pool_length,tracker_pool_length_unit,tracker_cardio_load,manually_logged_total_calories,manually_logged_total_steps,manually_logged_total_distance_mm,manually_logged_pool_length,manually_logged_pool_length_unit,events,activity_type_probabilities,autodetected_confirmed,autodetected_start_timestamp,autodetected_end_timestamp,autodetected_utc_offset,autodetected_activity_name,autodetected_sensor_based_activity_name,deletion_reason,activity_label,suggested_start_timestamp,suggested_end_timestamp,reconciliation_status
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,
1111111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000,xxxxxxxxxxxx,xxxxxxxxxxxxx,1,xxxxxxxxxxx,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,xxxxxxxxxxx,,,,

@ -0,0 +1,22 @@
UserLegacySettingData Export
The User Legacy Setting Data file contains legacy settings for the user, including clock display, start day of week, food budget, and weight objective settings.
Files included:
----------
UserLegacySettingData.csv
The data for User Legacy Setting Data:
clock12 - whether the user has opted for a 12-hour clock display (e.g., AM/PM) instead of a 24-hour clock.
start_day_of_week - the user's start day of week (e.g., Monday, Sunday). This affects how weekly data is displayed in the app.
food_budget - whether the user has enabled a food budget feature, and the intensity level selected for it (e.g. maintenance, strict).
food_plan_estimation_enabled - whether the user has enabled food plan estimation feature.
legal_terms - the version of the legal terms the user has accepted.
sdk_developer_enabled - whether the user has enabled the SDK developer mode.
sdk_legal_terms_version - the version of the SDK legal terms the user has accepted.
weight_objective - the user's weight objective (e.g., lose, maintain, gain).
weight_track_start_date - the date the user started tracking their weight.
weight_goal_target_date - the user's weight goal target date.
food_database - the user's food database.

@ -0,0 +1,5 @@
value_time,setting_name,setting_value
2020-04-13 10:09:08+0000,xxxxxxxxxxx,some/path
2020-04-13 10:09:08+0000,xxxxxxx,false
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxx,false
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxx,xxxxxx

@ -0,0 +1,27 @@
UserMBDData Export
The User MBD data file contains measured body data including sensitive fields such as is_nursing, pregnant_state, body_constitution etc.
Files included:
----------
UserMBDData.csv
The data for User MBD:
is_nursing - whether the user is nursing
pregnant_state - the pregnancy state of the user (not_pregnant, first_trimester etc.)
body_constitution - the body constitution of the user (unspecified, regular, lean)
hr_scaling_sleep_rest - the user's HR scaling sleep rest
stride_length_walking - the user's stride length walking (mm)
stride_length_running - the user's stride length running (mm)
auto_stride_length_walking - the user's auto stride length walking (mm)
auto_stride_length_running - the user's auto stride length running (mm)
auto_stride_enabled - whether the user's auto stride is enabled
auto_run_enabled - whether the user's auto run is enabled
activity_state - the user's activity state (sedentary, low_active, active etc.)
inactivity_alerts_days - the user's inactivity alerts days
sedentary_alert_times - the user's sedentary alert times
sedentary_prune_time - the user's sedentary prune time (UTC, no timezone offset)
stia_update_time - the user's STIA update time (UTC, no timezone offset)
sleep_proc_algorithm - the user's sleep process algorithm (unspecified, composite, sensitive)

@ -0,0 +1,12 @@
value_time,setting_name,setting_value
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxx,xxxxxxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxx,false
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxx,false
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxx,xxxxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxx,false
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxx,xxxxxxxxxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxx,xxxxxxxxx
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxx,1111
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxx,1
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxx,2020-04-13T10:09:08.000000Z
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

@ -0,0 +1,15 @@
UserProfileData Export
The User Profile data file contains identifying information for a user's Fitbit account, including biography, username, first name etc.
Files included:
----------
UserProfileData.csv
The data for User Profile:
about_me - user biography, free text
display_name_preference - preference for display name (name, username, or full name)
Note that it is expected for migrated Google accounts to contain entries with default Fitbit values for first_name ("firstName") and last_name ("lastName").

@ -0,0 +1,2 @@
value_time,setting_name,setting_value
2020-04-13 10:09:08+0000,xxxxxxxxxxxxxxxxxxxxxxx,xxxx

@ -0,0 +1,39 @@
Sleep Scores
The below section is dedicated to the exported data of the Sleep Score data domain.
Sleep scores merge multiple sleep metrics (duration, composition, heart rate data)
into a summarized scoring system.
Files Included:
----------
UserSleepScores.csv
The data for sleep scores
user_id - the unique identifier of the user
sleep_id - the unique identifier of the sleep session
sleep_score_id - the unique identifier for the sleep score record
data_source - the method used to record this score data (MANUAL, DERIVED, ACTIVELY_MEASURED, PASSIVELY_MEASURED)
score_utc_offset - timezone offset relative to UTC when the score was generated
score_time - the timestamp (UTC) when the score was generated
overall_score - the calculated overall sleep score
duration_score - sub-score reflecting duration alignment with sleep goals
composition_score - sub-score reflecting the ratio/balance of different sleep stages
revitalization_score - sub-score reflecting how restorative the sleep was (based on e.g. restlessness, HR, etc.)
sleep_time_minutes - total time spent asleep in minutes
deep_sleep_minutes - number of minutes spent in deep sleep
rem_sleep_percent - percentage of total sleep time in the REM stage
resting_heart_rate - measured resting heart rate during sleep
sleep_goal_minutes - the user's sleep goal, in minutes
waso_count_long_wakes - count of longer awakenings
waso_count_all_wake_time - total wake time after initially falling asleep
restlessness_normalized - a normalized measure of restlessness
hr_below_resting_hr - fraction of HR measurements that were below the previous day's resting HR
sleep_score_created - the creation timestamp of the sleep score record in UTC
sleep_score_last_updated - the last update timestamp of the sleep score record in UTC
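The score row mixes absolute minute counts with derived percentages. To illustrate how the two relate, a deep-sleep share can be derived from deep_sleep_minutes and sleep_time_minutes, mirroring how rem_sleep_percent relates to total sleep time; this helper is an illustration, not a field of the export.

```typescript
// Derived metric: percentage of total sleep time spent in deep sleep,
// computed from the export's deep_sleep_minutes and sleep_time_minutes.
function deepSleepPercent(
  deepSleepMinutes: number,
  sleepTimeMinutes: number,
): number {
  if (sleepTimeMinutes === 0) return 0;
  return (deepSleepMinutes / sleepTimeMinutes) * 100;
}
```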

@ -0,0 +1,21 @@
sleep_id,sleep_score_id,data_source,score_utc_offset,score_time,overall_score,duration_score,composition_score,revitalization_score,sleep_time_minutes,deep_sleep_minutes,rem_sleep_percent,resting_heart_rate,sleep_goal_minutes,waso_count_long_wakes,waso_count_all_wake_time,restlessness_normalized,hr_below_resting_hr,sleep_score_created,sleep_score_last_updated
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,11.111111111111111,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.11111111111111,-1,-1,-1,111,11,11.11111111111111,11,111,1,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1,1,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,1.1111111111111111,11,111,11.1,11,1.111111111111111111,1.111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1,1,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
111111111111111111,11111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,11,11.11,11,111,1,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.11,-1,-1,-1,111,11,11.11,11,111,1.1111111111111111,11,1.11111111111111111,1.1111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1.111111111111111,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1,1,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,111,11.11,11,111,11.111111111111111,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,11,11.11,11,111,1,1,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1.111111111111111,11,1.1111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1.1111111111111111,11,1.1111111111111111,1.1111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.111111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,1,1,1.1111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,11,11.11,11,111,1,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,11,11.11,11,111,1.1111111111111111,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,11,11.11,11,111,11.111111111111111,11,1.1111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.111,-1,-1,-1,111,11,11.11,11,111,1.1111111111111111,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,+00:00,2020-04-13 10:09:08+0000,11.11,-1,-1,-1,111,11,11.11,11,111,11.1,11,1.111111111111111111,1.1111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxxxx,-00:00,2020-04-13 10:09:08+0000,11.1111111111111,-1,-1,-1,111,11,11.111111111111111,11,111,11.111111111111111,11,1.111111111111111111,1.11111111111111111,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000

@ -0,0 +1,26 @@
Sleep Stages
The below section is dedicated to the exported data of the Sleep Stage data domain.
Sleep stage data gives more detailed insight into how a user's sleep is distributed
across different stages.
Files Included:
----------
UserSleepStages.csv
The data for sleep stages
user_id - the unique identifier of the user
sleep_id - the unique identifier of the sleep session
sleep_stage_id - the unique identifier of the sleep stage entry
sleep_stage_type - the type of sleep stage (AWAKE, LIGHT, DEEP, REM)
start_utc_offset - timezone offset relative to UTC at the start of this sleep stage
sleep_stage_start - the start time of this sleep stage in UTC
end_utc_offset - timezone offset relative to UTC at the end of this sleep stage
sleep_stage_end - the end time of this sleep stage in UTC
data_source - the method used to record this stage data (MANUAL, DERIVED, ACTIVELY_MEASURED, PASSIVELY_MEASURED)
sleep_stage_created - the creation timestamp of the sleep stage record in UTC
sleep_stage_last_updated - the last update timestamp of the sleep stage record in UTC

@ -0,0 +1,21 @@
sleep_id,sleep_stage_id,sleep_stage_type,start_utc_offset,sleep_stage_start,end_utc_offset,sleep_stage_end,data_source,sleep_stage_created,sleep_stage_last_updated
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxxxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,1111111111111111111,xxx,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000

@ -0,0 +1,33 @@
Sleeps
The below section is dedicated to the exported data of the Sleep data domain.
Sleeps are sent by the wearable device (or manually entered) and contain data about
the user's sleep sessions.
Files Included:
----------
UserSleeps.csv
The data for sleeps
user_id - the unique identifier of the user
sleep_id - the unique identifier of the sleep session
sleep_type - the type of the sleep session (CLASSIC, STAGES)
minutes_in_sleep_period - the total number of minutes between going to bed and final wake-up
minutes_after_wake_up - total minutes after the user wakes up until they leave bed or stop tracking
minutes_to_fall_asleep - number of minutes it took the user to fall asleep
minutes_asleep - the total number of minutes the user was actually asleep
minutes_awake - the total number of minutes awake during the sleep session
minutes_longest_awakening - duration (in minutes) of the single longest awakening
minutes_to_persistent_sleep - number of minutes between going to bed and the onset of sustained sleep
start_utc_offset - timezone offset relative to UTC at the start of the sleep
sleep_start - the start time of the sleep session in UTC
end_utc_offset - timezone offset relative to UTC at the end of the sleep
sleep_end - the end time of the sleep session in UTC
data_source - the method used to record this sleep (MANUAL, DERIVED, ACTIVELY_MEASURED, PASSIVELY_MEASURED)
sleep_created - the creation timestamp of the sleep record in UTC
sleep_last_updated - the last update timestamp of the sleep record in UTC
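A common derived metric over the minute columns above is sleep efficiency: the fraction of the sleep period actually spent asleep (minutes_asleep over minutes_in_sleep_period). It illustrates how the columns combine and is not itself a field of the export.

```typescript
// Sleep efficiency: fraction of the sleep period spent asleep,
// from the export's minutes_asleep and minutes_in_sleep_period columns.
function sleepEfficiency(
  minutesAsleep: number,
  minutesInSleepPeriod: number,
): number {
  if (minutesInSleepPeriod === 0) return 0;
  return minutesAsleep / minutesInSleepPeriod;
}
```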

@ -0,0 +1,21 @@
sleep_id,sleep_type,minutes_in_sleep_period,minutes_after_wake_up,minutes_to_fall_asleep,minutes_asleep,minutes_awake,minutes_longest_awakening,minutes_to_persistent_sleep,start_utc_offset,sleep_start,end_utc_offset,sleep_end,data_source,sleep_created,sleep_last_updated
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxxx,111,1,1,111,1,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxxx,111,1,1,11,1,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
111111111111111111,xxxxxx,111,1,1,111,11,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,1,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,11,1,1,-00:00,2020-04-13 10:09:08+0000,-00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,1,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,1,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxxx,11,1,1,11,1,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000
1111111111111111111,xxxxxx,111,1,1,111,1,1,1,+00:00,2020-04-13 10:09:08+0000,+00:00,2020-04-13 10:09:08+0000,xxxxxxx,2020-04-13 10:09:08+0000,2020-04-13 10:09:08+0000

@@ -0,0 +1,21 @@
timestamp,light,moderate,very,data source
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx
2020-04-13T10:09:08Z,1,1,1,xxxxxxxxxx

@@ -0,0 +1,17 @@
Time Series Data Export
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxx
xxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

@@ -0,0 +1,21 @@
timestamp,heart rate zone,total minutes,data source
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxx,1,xxxxxxxxx

@@ -0,0 +1,15 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
active_zone_minutes_YYYY-MM-DD.csv - Where YYYY-MM-DD is the starting date for the entries in the file.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
heart rate zone - Heart rate zone: fat burn, cardio or peak.
total minutes - Total minutes equals 1 for low intensity (fat burn) zones or 2 for high intensity zones (cardio, peak).
data source - The origin or source of this data.
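The 1-or-2 weighting described for "total minutes" can be sketched as follows (TypeScript; the function names are illustrative, not part of the export):

```typescript
// Per the description above: a minute in the fat burn zone counts as 1,
// a minute in a high intensity zone (cardio, peak) counts as 2.
function zoneMinutes(zone: "fat burn" | "cardio" | "peak"): number {
  return zone === "fat burn" ? 1 : 2;
}

// Summing the per-entry values yields the period's active zone minutes.
function totalActiveZoneMinutes(zones: ("fat burn" | "cardio" | "peak")[]): number {
  return zones.reduce((sum, zone) => sum + zoneMinutes(zone), 0);
}
```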

@@ -0,0 +1,21 @@
timestamp,level,data source
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxx,xxxxxxxxxx

@@ -0,0 +1,15 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
activity_level_YYYY-MM-DD.csv - Where YYYY-MM-DD is the starting date for the entries in the file.
Activity level categorizes how active one is during a certain time interval.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
level - Label that categorizes the activity level. Values can be SEDENTARY, LIGHTLY_ACTIVE, MODERATELY_ACTIVE, VERY_ACTIVE.
data source - The origin or source of this data.

@@ -0,0 +1,21 @@
timestamp,temperature celsius,data source
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx
2020-04-13T10:09:08Z,11.1,xxxxxxxxx

@@ -0,0 +1,14 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
body_temperature_YYYY-MM-DD.csv - Where YYYY-MM-DD is the starting date for the entries in the file.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
temperature celsius - The body temperature in Celsius.
data source - The origin or source of this data.

@@ -0,0 +1,21 @@
timestamp,calories,data source
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx
2020-04-13T10:09:08Z,1.11,xxxxxxxxxx

@@ -0,0 +1,21 @@
timestamp,heart rate zone type,kcal,data source
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,1.11111,xxxxxxxxxx

@@ -0,0 +1,15 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
calories_in_heart_rate_zone_YYYY-MM-DD.csv - Where YYYY-MM-DD is the starting date for the entries in the file.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
heart rate zone type - The heart rate zone type the calories were burned in.
kcal - Amount of calories burned in kilocalories.
data source - The origin or source of this data.

@@ -0,0 +1,14 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
calories_YYYY-MM-DD.csv - Where YYYY-MM-DD is the starting date for the entries in the file.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
calories - Number of calories burned.
data source - The origin or source of this data.

@@ -0,0 +1,21 @@
timestamp,ratio,label,data source
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.11111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.11111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.11111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13,1.1111111111111111,xxxxxxxxxxxxxx,xxxxxxxxxx

@@ -0,0 +1,16 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
cardio_acute_chronic_workload_ratio.csv - Contains all data for this type.
Cardio acute chronic workload ratio represents how recent load compares with longer-term regular load. It is used to determine whether the user is over- or under-training.
Each entry has the following values:
timestamp - Date at which the entry was logged in UTC.
ratio - Cardio acute chronic workload ratio value.
label - Label interpreting the ratio value. Values can be UNDER_TRAINING, OPTIMAL_TRAINING or OVER_TRAINING.
data source - The origin or source of this data.
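The ratio and its labels can be sketched as below. The window lengths (7-day acute, 28-day chronic) and the 0.8/1.3 thresholds are illustrative assumptions, not values documented by the export:

```typescript
const mean = (xs: number[]): number => xs.reduce((a, b) => a + b, 0) / xs.length;

// Acute load: mean over the most recent week (assumed window).
// Chronic load: mean over the most recent four weeks (assumed window).
function acuteChronicRatio(dailyLoads: number[]): number {
  return mean(dailyLoads.slice(-7)) / mean(dailyLoads.slice(-28));
}

// Thresholds here are placeholders; the export only names the labels.
function ratioLabel(ratio: number): string {
  if (ratio < 0.8) return "UNDER_TRAINING";
  if (ratio > 1.3) return "OVER_TRAINING";
  return "OPTIMAL_TRAINING";
}
```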

@@ -0,0 +1,21 @@
timestamp,workout,background,total,data source
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.111111111111111,1.111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.11111111111111111,1.11111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx
2020-04-13T10:09:08Z,1.1,1.1111111111111111,1.1111111111111111,xxxxxxxxxx

@@ -0,0 +1,21 @@
timestamp,min observed load,max observed load,data source
2020-04-13,1.1,11.1,xxxxxxxxxx
2020-04-13,1.1,11.1,xxxxxxxxxx
2020-04-13,1.1,11.11111111111111,xxxxxxxxxx
2020-04-13,1.1,11.11111111111111,xxxxxxxxxx
2020-04-13,1.1,11.11111111111111,xxxxxxxxxx
2020-04-13,1.1,11.11111111111111,xxxxxxxxxx
2020-04-13,1.1,11.11111111111111,xxxxxxxxxx
2020-04-13,1.1,11.11111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.1111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.111111111111111,11.111111111111111,xxxxxxxxxx
2020-04-13,1.111111111111111,11.11111111111111,xxxxxxxxxx

@@ -0,0 +1,16 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
cardio_load_observed_interval.csv - Contains all data for this type.
Cardio load observed interval measures the personalized cardio load interval for a user.
Each entry has the following values:
timestamp - Date at which the entry was logged in UTC.
min observed load - Average of the 3 lowest cardio load values over a period of 4 weeks.
max observed load - Average of the 3 highest cardio load values over a period of 4 weeks.
data source - The origin or source of this data.
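The interval computation described above (average of the 3 lowest and 3 highest values in the window) can be sketched as:

```typescript
// Assumes at least 3 daily load values; the 4-week windowing is done by
// the caller, this only takes the values already inside the window.
function observedLoadInterval(loads: number[]): { min: number; max: number } {
  const sorted = [...loads].sort((a, b) => a - b);
  const mean = (xs: number[]): number => xs.reduce((a, b) => a + b, 0) / xs.length;
  return { min: mean(sorted.slice(0, 3)), max: mean(sorted.slice(-3)) };
}
```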

@@ -0,0 +1,17 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
cardio_load_YYYY-MM-DD.csv - Where YYYY-MM-DD is the starting date for the entries in the file.
Cardio load, also known as cardio exertion, indicates the load on the cardiovascular system.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
workout - Cardio load accrued during workouts.
background - Cardio load accrued outside of workouts.
total - Total cardio load, sum of workout and background cardio load.
data source - The origin or source of this data.

@@ -0,0 +1,21 @@
timestamp,average heart rate variability milliseconds,non rem heart rate beats per minute,entropy,deep sleep root mean square of successive differences milliseconds,data source
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.11,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.11,xxxxxxxxxx
2020-04-13T10:09:08Z,11.11,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.11,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.111,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.11,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.11,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.11,xxxxxxxxxx
2020-04-13T10:09:08Z,11.11,11.1,1.111,11.111,xxxxxxxxxx
2020-04-13T10:09:08Z,11.11,11.1,1.11,11.1,xxxxxxxxxx
2020-04-13T10:09:08Z,11.1,11.1,1.111,11.1,xxxxxxxxxx

@@ -0,0 +1,17 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
daily_heart_rate_variability.csv - Contains all data for this type.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
average heart rate variability milliseconds - The average of a user's heart rate variability during sleep. Heart rate variability is calculated as the root mean square of successive differences (RMSSD) of heartbeat intervals.
non rem heart rate beats per minute - Heart rate during non-REM sleep, in beats per minute.
entropy - The Shannon entropy of heartbeat intervals.
deep sleep root mean square of successive differences milliseconds - The root mean square of successive differences (RMSSD) of heartbeat intervals during deep sleep.
data source - The origin or source of this data.
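RMSSD, as defined above, is the square root of the mean of squared successive differences between heartbeat (RR) intervals. A sketch in TypeScript (the function name and input shape are illustrative):

```typescript
// rrIntervalsMs: successive heartbeat (RR) intervals in milliseconds.
function rmssd(rrIntervalsMs: number[]): number {
  // Successive differences between adjacent intervals.
  const diffs = rrIntervalsMs.slice(1).map((x, i) => x - rrIntervalsMs[i]);
  // Root mean square of those differences.
  const meanSquare = diffs.reduce((sum, d) => sum + d * d, 0) / diffs.length;
  return Math.sqrt(meanSquare);
}
```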

@@ -0,0 +1,21 @@
timestamp,heart_rate_zone,data source
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx
2020-04-13T10:09:08Z,xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx,xxxxxxxxxx


@@ -0,0 +1,14 @@
Time Series Data Export
The Time Series export provides a detailed timeline of your tracked activity.
Files Included:
----------
daily_heart_rate_zones.csv - Contains all data for this type.
Each entry has the following values:
timestamp - Date and time at which the entry was logged.
heart_rate_zone - Heart rate zone information.
data source - The origin or source of this data.
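A row of this three-column CSV can be read as sketched below. The repo's tests use csv-parse for this; this minimal hand-rolled version is only an illustration and assumes no quoted or embedded commas, which holds for this fixture. The interface and field names are invented for the example.

```typescript
// Shape of one entry in daily_heart_rate_zones.csv (illustrative).
interface HeartRateZoneEntry {
  timestamp: string;
  heartRateZone: string;
  dataSource: string;
}

// Naive CSV row parser: sufficient for this fixture, where no field
// contains a comma or quotes. Real code should use a CSV library.
function parseRow(line: string): HeartRateZoneEntry {
  const [timestamp, heartRateZone, dataSource] = line.split(",");
  return { timestamp, heartRateZone, dataSource };
}

// Illustrative usage with made-up field values:
const entry = parseRow("2020-04-13T10:09:08Z,fat_burn,fitbit");
```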

Some files were not shown because too many files have changed in this diff.