PULSE-sample — Representative Single-Recording Subset

This is a small representative subset of the PULSE dataset, hosted as a separate Hugging Face repository so that NeurIPS 2026 reviewers can inspect data quality and schema without downloading the full 85 GB release.

Full dataset: lllwwwlll/PULSE

What this sample contains (~284 MB total)

A single complete recording — v1/s1 (volunteer 1, scenario S1 "Office desk organization", ~101 s) — with all five non-visual sensor modalities plus the synchronized scene-camera video, action-segment annotations, and the global metadata files needed to interpret everything.

PULSE-sample/
├── data/v1/s1/
│   ├── aligned_emg_100hz.csv             # 8-channel surface EMG @ 100 Hz
│   ├── aligned_eyetrack_100hz.csv        # 24-dim binocular gaze @ 100 Hz
│   ├── aligned_imu_100hz.csv             # 160-dim wearable IMU @ 100 Hz
│   ├── aligned_mocap_100hz.csv           # 56-joint optical motion capture @ 100 Hz
│   ├── aligned_pressure_100hz.csv        # 50-channel fingertip pressure @ 100 Hz
│   ├── aligned_myo_pose_100hz.csv        # forearm pose (auxiliary)
│   ├── aligned_myo_quat_100hz.csv        # forearm orientation (auxiliary)
│   ├── alignment_metadata.json           # per-recording sync diagnostics
│   ├── raw/                              # raw Qualisys MoCap stream (.tsv)
│   └── videos/                           # scene-cam + gaze-overlay (.mp4, 25 fps)
├── annotations/v1/s1.json                # dense segment annotations (action / hand / object / text)
├── annotations_flat/segments.csv         # the 30 segments of v1/s1, flattened
├── metadata/recordings.csv               # full 337-row recording manifest
├── metadata/modality_coverage.xlsx       # per-recording modality availability
├── LICENSE                               # CC BY-NC 4.0 (data)
└── CODE_LICENSE                          # MIT (code in companion repo)
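
Each row of annotations_flat/segments.csv pairs a human-readable timestamp (e.g. 00:03-00:05) with numeric start_sec / end_sec fields. As a minimal sketch, assuming the MM:SS-MM:SS format holds for all rows, the string can be parsed back to seconds like this (parse_timestamp is a hypothetical helper, not shipped with the dataset):

```python
def parse_timestamp(ts: str) -> tuple[int, int]:
    """Parse a 'MM:SS-MM:SS' segment timestamp into (start_sec, end_sec)."""
    def to_sec(mmss: str) -> int:
        minutes, seconds = mmss.split(":")
        return int(minutes) * 60 + int(seconds)

    start, end = ts.split("-")
    return to_sec(start), to_sec(end)

print(parse_timestamp("01:02-01:04"))  # → (62, 64)
```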

How to use this sample

import json

import pandas as pd

# Load all five modalities for the single recording
ROOT = "data/v1/s1"
emg      = pd.read_csv(f"{ROOT}/aligned_emg_100hz.csv")
eyetrack = pd.read_csv(f"{ROOT}/aligned_eyetrack_100hz.csv")
imu      = pd.read_csv(f"{ROOT}/aligned_imu_100hz.csv")
mocap    = pd.read_csv(f"{ROOT}/aligned_mocap_100hz.csv")
pressure = pd.read_csv(f"{ROOT}/aligned_pressure_100hz.csv")
print(f"Aligned shapes (T, D): {[x.shape for x in [emg, eyetrack, imu, mocap, pressure]]}")

# Load the dense segment annotations
with open("annotations/v1/s1.json") as f:
    ann = json.load(f)
print(f"{len(ann['segments'])} action segments")

All time series are sub-frame aligned (&lt;10 ms) on a shared 100 Hz timebase. The first sample of every modality file corresponds to t = 0 of the trimmed scene-cam video, and the total length matches the v1s1 row of metadata/recordings.csv (duration_sec, n_samples_100hz).
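
Because every modality shares the same t = 0 origin and 100 Hz rate, the rows belonging to an annotated segment can be recovered by index arithmetic alone. A minimal sketch, using a synthetic frame in place of aligned_emg_100hz.csv (segment_rows is a hypothetical helper, not part of the dataset tooling):

```python
import numpy as np
import pandas as pd

RATE_HZ = 100  # shared timebase of the aligned_*_100hz.csv files

def segment_rows(df: pd.DataFrame, start_sec: float, end_sec: float) -> pd.DataFrame:
    """Slice the rows of an aligned modality frame falling in [start_sec, end_sec)."""
    return df.iloc[int(start_sec * RATE_HZ):int(end_sec * RATE_HZ)]

# Synthetic stand-in for aligned_emg_100hz.csv: ~101 s of 8-channel data.
n = 101 * RATE_HZ
emg = pd.DataFrame(np.random.randn(n, 8),
                   columns=[f"emg_{i}" for i in range(1, 9)])
emg.insert(0, "time", np.arange(n) / RATE_HZ)

clip = segment_rows(emg, 3, 5)  # e.g. a segment annotated 00:03-00:05
print(clip.shape)               # → (200, 9): 2 s × 100 Hz, time + 8 channels
```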

How this sample was created

Selected by the dataset authors as a representative recording: scenario S1 "office desk organization" was chosen because it contains a typical mix of grasp / move / place / adjust primitives without unusually short or long sub-tasks; v1 was chosen because it has all five modalities present and full-length scene-cam video.

The full 337-row metadata/recordings.csv is included so reviewers can see exactly where this recording sits in the train/test split scheme and which other recordings exist; the global Croissant metadata is on the main repo.
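
Assuming recordings.csv keys its rows by a recording_id column (the manifest's exact columns are not documented here, and the split column below is an illustrative guess), locating this sample in the manifest might look like the following; the toy frame stands in for the real 337-row file:

```python
import pandas as pd

# Toy stand-in for metadata/recordings.csv (real file: 337 rows).
# Columns other than recording_id / duration_sec are illustrative guesses.
recordings = pd.DataFrame({
    "recording_id": ["v1s1", "v1s2", "v2s1"],
    "duration_sec": [101, 95, 120],
    "split": ["train", "train", "test"],
}).set_index("recording_id")

row = recordings.loc["v1s1"]
print(row["split"], row["duration_sec"])  # → train 101
```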

License & attribution

Data is released under CC BY-NC 4.0. By accessing PULSE-sample you agree to the license, including the prohibition on commercial redeployment, re-identification, and worker-surveillance applications. See LICENSE for the full terms. Companion code is released under MIT (see CODE_LICENSE).

Citation

@inproceedings{anonymous2026pulse,
  title     = {PULSE: A Synchronized Five-Modality Dataset for Multi-Modal Daily Activity Understanding},
  author    = {Anonymous Authors},
  booktitle = {Submitted to NeurIPS 2026 Datasets and Benchmarks Track},
  year      = {2026},
  note      = {Under double-blind review}
}