Chromium Code Reviews
| 1 // Copyright 2017 The Chromium Authors. All rights reserved. | |
| 2 // Use of this source code is governed by a BSD-style license that can be | |
| 3 // found in the LICENSE file. | |
| 4 | |
| 5 // Private API for receiving real-time media perception information. | |
| 6 [platforms=("chromeos")] | |
| 7 namespace mediaPerceptionPrivate { | |
| 8 enum Status { | |
| 9 // The media analytics process is waiting to be launched. | |
| 10 UNINITIALIZED, | |
| 11 | |
| 12 // The analytics process is running and the media processing pipeline has started, | |
|
Devlin
2017/05/17 20:22:42
nit: use complete sentences in comments, so someth
Luke Sorenson
2017/05/17 21:01:36
Done.
| |
| 13 // but it is not yet receiving image frames. This is a transitional state | |
| 14 // between SUSPENDED and RUNNING for the time it takes to warm up the | |
| 15 // media processing pipeline, which can take anywhere from a few seconds | |
| 16 // to a minute. | |
| 17 // Note: STARTED is the initial reply to SetState RUNNING. | |
| 18 STARTED, | |
| 19 | |
| 20 // The analytics process is running and the media processing pipeline is | |
| 21 // ingesting image frames. At this point, MediaPerception signals should | |
| 22 // be coming over D-Bus. | |
| 23 RUNNING, | |
| 24 | |
| 25 // The analytics process is running and ready to be set to state RUNNING. | |
| 26 // The D-Bus communications are enabled but the media processing pipeline | |
| 27 // is suspended. | |
| 28 SUSPENDED | |
| 29 }; | |
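
The lifecycle the Status comments describe (launch into SUSPENDED, warm up through STARTED, then RUNNING) can be sketched as a small transition checker. This is an illustrative reading of the comments above, not part of the API; the table and helper names are hypothetical.

```javascript
// Allowed transitions between media analytics process states, inferred from
// the Status enum comments. The pipeline warms up through STARTED before it
// reaches RUNNING; this table is illustrative, not part of the API.
const TRANSITIONS = {
  UNINITIALIZED: ['SUSPENDED'],  // The process launches into SUSPENDED.
  SUSPENDED: ['STARTED'],        // setState(RUNNING) first replies STARTED.
  STARTED: ['RUNNING'],          // The pipeline finishes warming up.
  RUNNING: ['SUSPENDED'],        // setState(SUSPENDED) suspends the pipeline.
};

// Returns true if the analytics process may move from |from| to |to|.
function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}
```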
| 30 | |
| 31 // The system and configuration state of the analytics process and v4lplugin. | |
| 32 dictionary State { | |
| 33 Status status; | |
| 34 | |
| 35 // Optional $(ref:setState) parameter. Specifies the video device the media | |
| 36 // analytics process should open while the media processing pipeline is | |
| 37 // starting. To set this parameter, status has to be RUNNING. | |
| 38 DOMString? deviceContext; | |
| 39 }; | |
| 40 | |
| 41 dictionary Point { | |
| 42 // The horizontal distance from the top left corner of the image. | |
| 43 double? x; | |
|
Devlin
2017/05/17 20:22:42
When would these be null?
Luke Sorenson
2017/05/17 21:01:36
Previously discussed the reason for these null val
Devlin
2017/05/18 21:21:24
That's a bit weird, but since this is a private AP
Luke Sorenson
2017/05/18 21:56:33
Acknowledged.
| |
| 44 | |
| 45 // The vertical distance from the top left corner of the image. | |
| 46 double? y; | |
| 47 }; | |
| 48 | |
| 49 dictionary BoundingBox { | |
| 50 // Specifies whether the points are normalized to the size of the image. | |
| 51 boolean? normalized; | |
| 52 | |
| 53 // The two points that define the corners of a bounding box. | |
| 54 Point? topLeft; | |
| 55 Point? bottomRight; | |
| 56 }; | |
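
When `normalized` is true, the two Points are fractions of the image size rather than pixels. A consumer-side sketch of converting such a box to pixel coordinates (a hypothetical helper; the API itself only transports the dictionary):

```javascript
// Converts a BoundingBox whose Points are normalized to [0, 1] into pixel
// coordinates for a frame of the given size. Hypothetical helper assuming
// both Points and the normalized flag are present.
function toPixelBox(box, frameWidthInPx, frameHeightInPx) {
  if (!box.normalized) {
    return box;  // Already in pixel coordinates.
  }
  const scale = (point) => ({
    x: point.x * frameWidthInPx,
    y: point.y * frameHeightInPx,
  });
  return {
    normalized: false,
    topLeft: scale(box.topLeft),
    bottomRight: scale(box.bottomRight),
  };
}
```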
| 57 | |
| 58 enum EntityType { | |
| 59 UNSPECIFIED, | |
| 60 FACE, | |
| 61 PERSON | |
| 62 }; | |
| 63 | |
| 64 dictionary Entity { | |
| 65 // A unique id associated with the detected entity, which can be used to | |
| 66 // track the entity over time. | |
| 67 long? id; | |
|
Devlin
2017/05/17 20:22:41
Why would these be null?
I'm not familiar enough
Luke Sorenson
2017/05/17 21:01:36
See above explanation.
| |
| 68 | |
| 69 EntityType? type; | |
| 70 | |
| 71 // Minimum box which captures entire detected entity. | |
| 72 BoundingBox? boundingBox; | |
| 73 | |
| 74 // A value for the quality of this detection. | |
| 75 double? confidence; | |
| 76 | |
| 77 // The estimated depth of the entity from the camera. | |
| 78 double? depthInMeters; | |
| 79 }; | |
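
Since every Entity field is optional (mirroring proto semantics, per the review thread above), consumers have to filter defensively. A hypothetical consumer-side helper:

```javascript
// Selects detected entities of a given type at or above a confidence
// threshold, skipping entities with missing fields. Illustrative helper,
// not part of the API surface.
function selectEntities(entities, type, minConfidence) {
  return (entities || []).filter(
      (e) => e.type === type &&
             e.confidence !== undefined &&
             e.confidence >= minConfidence);
}
```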
| 80 | |
| 81 // The set of computer vision metadata for an image frame. | |
| 82 dictionary FramePerception { | |
| 83 long? frameId; | |
| 84 | |
| 85 long? frameWidthInPx; | |
| 86 long? frameHeightInPx; | |
| 87 | |
| 88 // The timestamp associated with the frame (when it is received by the | |
| 89 // analytics process). | |
| 90 double? timestamp; | |
| 91 | |
| 92 // The list of entities detected in this frame. | |
| 93 Entity[]? entities; | |
|
Devlin
2017/05/17 20:22:41
Usually prefer an empty list rather than an option
Luke Sorenson
2017/05/17 21:01:36
Above explanation. Modeled after proto behavior.
| |
| 94 }; | |
| 95 | |
| 96 dictionary MediaPerception { | |
| 97 // The time the media perception data was emitted by the media processing | |
| 98 // pipeline. This value will be greater than the timestamp stored within | |
| 99 // the FramePerception dictionary and the difference between them can be | |
| 100 // viewed as the processing time for a single frame. | |
| 101 double? timestamp; | |
| 102 | |
| 103 // An array of framePerceptions. | |
| 104 FramePerception[]? framePerceptions; | |
| 105 }; | |
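
The MediaPerception comment notes that the difference between its emit timestamp and each FramePerception's receipt timestamp can be read as the per-frame processing time. A sketch of that computation (hypothetical helper; it assumes both timestamps share one unit):

```javascript
// Computes the per-frame processing time implied by a MediaPerception
// dictionary: the pipeline emit time minus each frame's receipt time.
// Frames without a timestamp are skipped, since the field is optional.
function processingTimes(mediaPerception) {
  const frames = mediaPerception.framePerceptions || [];
  return frames
      .filter((fp) => fp.timestamp !== undefined)
      .map((fp) => ({
        frameId: fp.frameId,
        processingTime: mediaPerception.timestamp - fp.timestamp,
      }));
}
```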
| 106 | |
| 107 enum ImageFormat { | |
| 108 UNSPECIFIED, | |
| 109 RAW, | |
|
Devlin
2017/05/17 20:22:41
Is RGB an image format?
Luke Sorenson
2017/05/17 21:01:36
Based on media_perception.proto, but this could be
Devlin
2017/05/18 21:21:24
RAW makes a bit more sense to me. Comments would
Luke Sorenson
2017/05/18 21:56:33
Done.
| |
| 110 PNG, | |
| 111 JPEG | |
| 112 }; | |
| 113 | |
| 114 dictionary ImageFrame { | |
| 115 long? width; | |
| 116 long? height; | |
| 117 | |
| 118 ImageFormat? format; | |
| 119 | |
| 120 long? dataLength; | |
| 121 | |
| 122 // The bytes of the image frame. | |
| 123 ArrayBuffer? frame; | |
| 124 }; | |
| 125 | |
| 126 dictionary PerceptionSample { | |
| 127 // The video analytics FramePerception for the associated image frame | |
| 128 // data. | |
| 129 FramePerception? framePerception; | |
| 130 | |
| 131 // The image frame data for the associated FramePerception object. | |
| 132 ImageFrame? imageFrame; | |
| 133 }; | |
| 134 | |
| 135 dictionary Diagnostics { | |
| 136 // A buffer of image frames and the associated video analytics information | |
| 137 // that can be used to diagnose a malfunction. | |
| 138 PerceptionSample[]? perceptionSamples; | |
| 139 }; | |
| 140 | |
| 141 callback StateCallback = void(State state); | |
| 142 | |
| 143 callback DiagnosticsCallback = void(Diagnostics diagnostics); | |
| 144 | |
| 145 interface Functions { | |
| 146 // Gets the status of the media perception process. | |
|
Devlin
2017/05/17 20:22:42
Function comments should be descriptive, not imper
Luke Sorenson
2017/05/17 21:01:36
Done.
| |
| 147 // |callback| : The current state of the system. | |
| 148 static void getState(StateCallback callback); | |
| 149 | |
| 150 // Sets the desired state of the system. | |
| 151 // |state| : A dictionary with the desired new state. The only settable | |
| 152 // states are RUNNING and SUSPENDED. | |
| 153 // |callback| : Invoked with the State of the system after setting it, to | |
|
Devlin
2017/05/17 20:22:42
The callback isn't the state of the system; it's i
Luke Sorenson
2017/05/17 21:01:36
We do indicate failure via an error, however there
Devlin
2017/05/18 21:21:24
When is it the case that calling setState() not se
Luke Sorenson
2017/05/18 21:56:33
For simplicity, the app can only setState RUNNING
Devlin
2017/05/19 15:08:12
Is there a reason those cases shouldn't be surface
Luke Sorenson
2017/05/19 15:51:04
So it's never the case currently that you need to
Devlin
2017/05/19 15:58:17
Just because we only respond with one doesn't mean
| |
| 154 // verify the state was set as desired. | |
| 155 static void setState(State state, StateCallback callback); | |
| 156 | |
| 157 // Gets a diagnostics buffer out of the video analytics process. | |
| 158 // |callback| : Invoked with a Diagnostics dictionary object. | |
| 159 static void getDiagnostics(DiagnosticsCallback callback); | |
| 160 }; | |
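
A sketch of how an extension might drive these functions. `chrome.mediaPerceptionPrivate` is only exposed to permitted extensions on Chrome OS, so the snippet stubs the API object to illustrate the calling convention; the stub's behavior (including replying STARTED to a RUNNING request, per the Status comments) is illustrative, not the real analytics process.

```javascript
// Illustrative stub standing in for chrome.mediaPerceptionPrivate. The real
// implementation forwards these calls to the media analytics process over
// D-Bus; here we only model the callback shapes.
const mediaPerceptionPrivate = {
  state: {status: 'SUSPENDED'},
  getState(callback) { callback(this.state); },
  setState(state, callback) {
    // Per the IDL comments, the initial reply to setState(RUNNING) is
    // STARTED while the pipeline warms up.
    this.state =
        {status: state.status === 'RUNNING' ? 'STARTED' : state.status};
    callback(this.state);
  },
};

// Typical flow: check the current state, then ask for RUNNING. The
// deviceContext parameter name follows the State dictionary above.
function startPipeline(api, deviceContext, done) {
  api.getState((state) => {
    if (state.status !== 'SUSPENDED') {
      done(state);  // Not in a state we can start from.
      return;
    }
    api.setState({status: 'RUNNING', deviceContext}, done);
  });
}
```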
| 161 | |
| 162 interface Events { | |
| 163 // Fired when media perception information is received from the media | |
| 164 // analytics process. | |
| 165 // |mediaPerception| : The dictionary which contains a dump of everything | |
| 166 // the analytics process has detected or determined from the incoming media | |
| 167 // streams. | |
| 168 static void onMediaPerception(MediaPerception mediaPerception); | |
| 169 }; | |
| 170 }; | |