Intro

Hi there!

As I mentioned in my previous post, Latest goodies October #1, there were a few interesting presentations at OBTS v8.0 that motivated me to learn more about macOS and *OS platforms.
One of the most interesting was Sarah Edwards's The Power of Powerlogs. All presentations, slides and videos are on the OBTS - Talks page.
I was aware of macOS powerlogs (most probably from Sarah's blog), but I never spent time investigating them in detail or learning about their structure. This seemed like a good opportunity to do so.

So what are powerlogs?

Sarah's description on OBTS - Talks page is great. But what does Apple actually say about them? I am going to check the documentation!
[Screenshot: Apple's documentation search for "powerlog" showing "No results were found. Please try a different keyword."]
Ah, maybe I just need to search for "powerlogs" instead!? But no, I still see the same "No results were found. Please try a different keyword."

So Sarah's description is probably also the most accurate, since Apple's docs come up empty. I tried searching in a few other places (Google Search, Perplexity AI, asking an LLM, etc.), but no official documentation or resources turned up. If you know of any, please let me know!

For now, let's focus on understanding the structure of powerlogs and how they can be used for DFIR purposes.

How to collect powerlogs?

The method that I used is well known and possibly the easiest. Just run:

sudo sysdiagnose

After a few minutes, you should see a compressed archive similar to sysdiagnose_2025.10.29_22-27-18+0100_macOS_Mac_25A362.tar.gz.
The archive contains a multitude of files and directories, including the powerlogs.

ls sysdiagnose_2025.10.29_22-27-18+0100_macOS_Mac_25A362
Accessibility                                filecoordination.txt                         nclist.txt                                   smcDiagnose.txt
acdiagnose-501.txt                           FileProvider                                 network-info                                 spindump.txt
apfs_stats.txt                               find-system-migration-history.txt            nfsstat.txt                                  summaries
<snip>...</snip>
diskutil_info.txt                            logs                                         sample-24312-highcpu.txt                     var_run_resolv.conf
diskutil_list.txt                            lsappinfo.txt                                sample-24323-highcpu.txt                     vm_stat.txt
diskutil_listClients.txt                     lsregister-0.csstoredump                     securebootvariables.txt                      WiFi
diskutil_listSnapshot.txt                    lsregister-501.csstoredump                   security-sysdiagnose.txt                     WindowServer.external.winfo.plist
efi-dump-logs.txt                            mddiagnose.mdsdiagnostic                     sfltool.LSSharedFileList.FavoriteItems.txt   xartutil.txt
error_log.txt                                microstackshots                              sfltool.LSSharedFileList.FavoriteVolumes.txt zprint.txt
errors                                       mount.txt                                    sfltool.LSSharedFileList.iCloudItems.txt

What we are looking for should be in the logs directory:

ls sysdiagnose_2025.10.29_22-27-18+0100_macOS_Mac_25A362/logs/powerlogs
log_2025-10-29_22-28_C487FA34.EPSQL      powerlog_2025-10-29_22-28_053DFF48.PLSQL
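Extracting the archive and picking out these files can of course be scripted. Here is a minimal Python sketch of my own (not part of any official tooling); it assumes the layout shown above, with the powerlogs sitting under <archive root>/logs/powerlogs:

```python
# Sketch: extract a sysdiagnose archive and list the powerlog files inside.
# The layout (<root>/logs/powerlogs) matches the listing above; actual
# archive and file names will differ on your machine.
import tarfile
from pathlib import Path


def extract_powerlogs(archive: Path, dest: Path) -> list[Path]:
    """Extract the archive into dest and return the powerlog file paths."""
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)  # only extract archives you trust
    return sorted(dest.glob("*/logs/powerlogs/*"))
```
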

Powerlogs content

Now that we have the logs, we can take a look at them and analyze them for anything valuable, be it for forensic purposes, security investigations, or plain curiosity.

The file extensions are quite suggestive of how the data is structured; nevertheless, let's check the file type as identified by the file command:

file powerlogs/log_2025-10-29_22-28_C487FA34.EPSQL
powerlogs/log_2025-10-29_22-28_C487FA34.EPSQL: SQLite 3.x database, last written using SQLite version 3051000, file counter 245, database pages 200, cookie 0x21, schema 4, largest root page 21, UTF-8, vacuum mode 1, version-valid-for 245

file powerlogs/powerlog_2025-10-29_22-28_053DFF48.PLSQL
powerlogs/powerlog_2025-10-29_22-28_053DFF48.PLSQL: SQLite 3.x database, last written using SQLite version 3051000, file counter 3282, database pages 2045, cookie 0x5f1, schema 4, largest root page 373, UTF-8, vacuum mode 1, version-valid-for 3282
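What file detects here is the 16-byte magic header that starts every SQLite 3 database, "SQLite format 3\0". If you want to double-check that yourself without relying on file, a small sketch:

```python
# Sketch: verify a file is a SQLite 3 database by its magic header.
# Every SQLite 3 file begins with the 16 bytes b"SQLite format 3\x00".
SQLITE_MAGIC = b"SQLite format 3\x00"


def is_sqlite3(path: str) -> bool:
    """Return True if the file starts with the SQLite 3 magic header."""
    with open(path, "rb") as f:
        return f.read(16) == SQLITE_MAGIC
```
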

Side note: I was curious about the difference between the EPSQL and PLSQL file extensions, but I couldn't find an explanation I would be confident about. This might become clearer as we explore the contents of these files.

One of my favorite tools for exploring SQLite databases is DB Browser for SQLite. Another option is the venerable sqlite3 command-line tool:

sqlite3 powerlogs/log_2025-10-29_22-28_C487FA34.EPSQL

SQLite version 3.51.0 2025-06-12 13:14:41
Enter ".help" for usage hints.
sqlite> .stats
Memory Used:                         176072 (max 195832) bytes
Number of Outstanding Allocations:   411 (max 422)
Number of Pcache Overflow Bytes:     5312 (max 5312) bytes
Largest Allocation:                  122400 bytes
Largest Pcache Allocation:           4104 bytes
Lookaside Slots Used:                55 (max 143)
Successful lookaside attempts:       710
Lookaside failures due to size:      34
Lookaside failures due to OOM:       0
Pager Heap Usage:                    18688 bytes
Page cache hits:                     8
Page cache misses:                   3
Page cache writes:                   0
Page cache spills:                   0
Schema Heap Usage:                   8696 bytes
Statement Heap/Lookaside Usage:      0 bytes
sqlite> .tables
BatteryDataCollection_BDC_Daily_1095_1
BatteryDataCollection_BDC_OBC_365_1
BatteryDataCollection_BDC_Once_3650_1
BatteryDataCollection_BDC_SBC_365_1
BatteryDataCollection_BDC_SmartCharging_365_1
BatteryDataCollection_BDC_Weekly_1095_1
BatteryMetrics_BatteryConfig_1_1
BatteryTrustedData_Daily_365_1
PLCoreStorage_Metadata
PLCoreStorage_MetadataVersion
PLCoreStorage_Metadata_Dynamic
PPTStorageOperator_Config_1095_1
PPTStorageOperator_Configuration_1095_1
PPTStorageOperator_Locale_1095_1
PPTStorageOperator_Locale_30_1
PPTStorageOperator_TimeOffset_1095_1

While exploring the data, it is important to note that the number of tables and their contents depend heavily on the specific device and its usage patterns. So the table names mentioned here aren't exhaustive.
For the powerlogs collected on my device, here are the table counts in each database:

sqlite3 powerlogs/log_2025-10-29_22-28_C487FA34.EPSQL ".tables" | wc -l
      16

sqlite3 powerlogs/powerlog_2025-10-29_22-28_053DFF48.PLSQL ".tables" | wc -l
     344
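The same counts can be obtained programmatically by querying the sqlite_master catalog (which is what .tables enumerates under the hood). A sketch; note it queries the catalog directly instead of parsing .tables output, and it skips SQLite's internal sqlite_* tables:

```python
# Sketch: count user tables in a SQLite database via the sqlite_master
# catalog, rather than piping `.tables` through `wc -l`.
import sqlite3


def count_tables(db_path: str) -> int:
    """Return the number of user-defined tables in the database."""
    with sqlite3.connect(db_path) as conn:
        (n,) = conn.execute(
            "SELECT COUNT(*) FROM sqlite_master "
            "WHERE type = 'table' AND name NOT LIKE 'sqlite_%'"
        ).fetchone()
    return n
```
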

Analyzing 344 tables manually is a tedious task. Beyond the time and effort required, the data structure needs to be well documented and understood. And since, as mentioned before, the logs are device- and platform-specific and influenced by user behavior, analyzing another set of logs would require almost the same effort all over again. Why not automate the process?

I started writing a few small Python scripts to aid in the analysis of the powerlogs and speed up routine tasks.

Pawalogs toolset

The toolset is a work in progress, and I will add new features as I go along analyzing the data and understanding the structure of the logs. As I gain more insights, I will update the toolset to provide more comprehensive analysis and visualization capabilities.

As of now, there are 3 scripts:

  • Database Schema Inspector: Reads a SQLite database file and extracts table names and schemas.
  • Field Analyzer: Analyzes database schemas to find identical or similar fields across tables
    using Claude AI in headless mode.
  • Table Counts: Counts the number of entries in each table of a SQLite database.
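To give a taste of what the Database Schema Inspector does, here is a minimal sketch of my own illustration (not the actual pawalogs code) that pulls table names and their CREATE statements from sqlite_master:

```python
# Sketch: map each user table to its CREATE TABLE statement, the core
# of what a schema inspector extracts from a SQLite database.
import sqlite3


def dump_schemas(db_path: str) -> dict[str, str]:
    """Return {table name: CREATE TABLE statement} for user tables."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name, sql FROM sqlite_master "
            "WHERE type = 'table' AND name NOT LIKE 'sqlite_%' "
            "ORDER BY name"
        ).fetchall()
    return dict(rows)
```
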

If you're interested in how they work, I suggest taking a look at the source code in pawalogs. I chose not to go into detail about the implementation, as it is still a work in progress and things might change, so anything posted here would quickly become outdated.

Wrap up

I will try to keep each post concise and easy to digest. I hope this will serve as a useful resource for my future self and anyone just starting out with powerlogs analysis.
In the next post, I will explore the data in specific tables and their relationships, and provide some more analysis tools.

Till next time!