[![actions](https://github.com/open-mmlab/mmpose/workflows/build/badge.svg)](https://github.com/open-mmlab/mmpose/actions)
[![codecov](https://codecov.io/gh/open-mmlab/mmpose/branch/latest/graph/badge.svg)](https://codecov.io/gh/open-mmlab/mmpose)
[![PyPI](https://img.shields.io/pypi/v/mmpose)](https://pypi.org/project/mmpose/)
- [![LICENSE](https://img.shields.io/github/license/open-mmlab/mmpose.svg)](https://github.com/open-mmlab/mmpose/blob/master/LICENSE)
+ [![LICENSE](https://img.shields.io/github/license/open-mmlab/mmpose.svg)](https://github.com/open-mmlab/mmpose/blob/main/LICENSE)
[![Average time to resolve an issue](https://isitmaintained.com/badge/resolution/open-mmlab/mmpose.svg)](https://github.com/open-mmlab/mmpose/issues)
[![Percentage of issues still open](https://isitmaintained.com/badge/open/open-mmlab/mmpose.svg)](https://github.com/open-mmlab/mmpose/issues)
@@ -63,7 +63,7 @@ English | [简体中文](README_CN.md)
MMPose is an open-source toolbox for pose estimation based on PyTorch.
It is a part of the [OpenMMLab project](https://github.com/open-mmlab).

- The master branch works with **PyTorch 1.8+**.
+ The main branch works with **PyTorch 1.8+**.

https://user-images.githubusercontent.com/15977946/124654387-0fd3c500-ded1-11eb-84f6-24eeddbf4d91.mp4
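
For a quick feel of the toolbox, the sketch below runs single-image inference with the high-level `MMPoseInferencer` API from MMPose 1.x. The `'human'` alias and the image path are illustrative placeholders, and the snippet is a minimal example rather than the full demo workflow.

```python
# Minimal inference sketch (assumes `pip install mmpose` plus its MMEngine/MMCV
# dependencies; the image path below is a placeholder).
from mmpose.apis import MMPoseInferencer

# 'human' is a shorthand alias that resolves to a default human pose estimator.
inferencer = MMPoseInferencer('human')

# The inferencer yields results lazily; take the first item for a single image.
result_generator = inferencer('path/to/image.jpg', show=False)
result = next(result_generator)
print(result['predictions'])
```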
@@ -97,9 +97,12 @@ https://user-images.githubusercontent.com/15977946/124654387-0fd3c500-ded1-11eb-
## What's New

- - We are excited to release **YOLOX-Pose**, a one-stage multi-person pose estimation model based on YOLOX. Check out our [project page](/projects/yolox-pose/) for more details.
+ - We are glad to support 3 new datasets:
+ - (CVPR 2023) [Human-Art](https://github.com/IDEA-Research/HumanArt)
+ - (CVPR 2022) [Animal Kingdom](https://github.com/sutdcv/Animal-Kingdom)
+ - (AAAI 2020) [LaPa](https://github.com/JDAI-CV/lapa-dataset/)

- ![yolox-pose_intro](https://user-images.githubusercontent.com/26127467/226655503-3cee746e-6e42-40be-82ae-6e7cae2a4c7e.jpg)
+ ![image](https://github.com/open-mmlab/mmpose/assets/13503330/c9171dbb-7e7a-4c39-98e3-c92932182efb)

- Welcome to [*projects of MMPose*](/projects/README.md), where you can access the latest features of MMPose and share your ideas and code with the community. Contributing to MMPose is simple and smooth:
@@ -108,20 +111,22 @@ https://user-images.githubusercontent.com/15977946/124654387-0fd3c500-ded1-11eb-
- Build individual projects with the full power of MMPose, without being bound to heavy frameworks
- Check out new projects:
- [RTMPose](/projects/rtmpose/)
- - [YOLOX-Pose](/projects/yolox-pose/)
+ - [YOLOX-Pose](/projects/yolox_pose/)
- [MMPose4AIGC](/projects/mmpose4aigc/)
+ - [Simple Keypoints](/projects/skps/)
- Become a contributor and make MMPose greater. Start your journey from the [example project](/projects/example_project/)
<br />

- - 2023-04-06: MMPose [v1.0.0](https://github.com/open-mmlab/mmpose/releases/tag/v1.0.0) is officially released, with the main updates including:
+ - 2023-07-04: MMPose [v1.1.0](https://github.com/open-mmlab/mmpose/releases/tag/v1.1.0) is officially released, with the main updates including:

- - Release of [YOLOX-Pose](/projects/yolox-pose/), a one-stage multi-person pose estimation model based on YOLOX
- - Development of [MMPose for AIGC](/projects/mmpose4aigc/) based on RTMPose, generating high-quality skeleton images for pose-guided AIGC projects
- - Support for OpenPose-style skeleton visualization
- - More complete and user-friendly [documentation and tutorials](https://mmpose.readthedocs.io/en/latest/overview.html)
+ - Support for new datasets: Human-Art, Animal Kingdom, and LaPa
+ - Support for a new, more user-friendly and flexible config type (see the sketch after this list)
+ - Improved RTMPose performance
+ - Migration of 3D pose estimation models on Human3.6M
+ - Inference speed-up and webcam inference in all demo scripts

- Please refer to the [release notes](https://github.com/open-mmlab/mmpose/releases/tag/v1.0.0) for more updates brought by MMPose v1.0.0!
+ Please refer to the [release notes](https://github.com/open-mmlab/mmpose/releases/tag/v1.1.0) for more updates brought by MMPose v1.1.0!
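
The new config type mentioned above refers to MMEngine's pure-Python config style, where base configs are pulled in with `read_base()` and fields are overridden as plain Python variables. Below is a minimal, hedged sketch: the base config path and the field values are illustrative assumptions rather than settings copied from the repository, and such a file is meant to be loaded with `mmengine.config.Config.fromfile` rather than executed directly.

```python
# Hedged sketch of a new-style (pure Python) config file; paths and values are illustrative.
from mmengine.config import read_base

with read_base():
    # Inherit everything from a base config (hypothetical relative path).
    from .._base_.default_runtime import *  # noqa: F401,F403

# Override or add settings as ordinary Python objects.
train_cfg = dict(max_epochs=210, val_interval=10)
optim_wrapper = dict(optimizer=dict(type='Adam', lr=5e-4))
```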
## 0.x / 1.x Migration
@@ -139,18 +144,18 @@ MMPose v1.0.0 is a major update, including many API and config file changes. Cur
| HigherHRNet (CVPR 2020) | |
| DeepPose (CVPR 2014) | done |
| RLE (ICCV 2021) | done |
- | SoftWingloss (TIP 2021) | |
- | VideoPose3D (CVPR 2019) | in progress |
+ | SoftWingloss (TIP 2021) | done |
+ | VideoPose3D (CVPR 2019) | done |
| Hourglass (ECCV 2016) | done |
| LiteHRNet (CVPR 2021) | done |
| AdaptiveWingloss (ICCV 2019) | done |
| SimpleBaseline2D (ECCV 2018) | done |
| PoseWarper (NeurIPS 2019) | |
- | SimpleBaseline3D (ICCV 2017) | in progress |
+ | SimpleBaseline3D (ICCV 2017) | done |
| HMR (CVPR 2018) | |
| UDP (CVPR 2020) | done |
| VIPNAS (CVPR 2021) | done |
- | Wingloss (CVPR 2018) | |
+ | Wingloss (CVPR 2018) | done |
| DarkPose (CVPR 2020) | done |
| Associative Embedding (NIPS 2017) | in progress |
| VoxelPose (ECCV 2020) | |
@@ -214,13 +219,13 @@ A summary can be found in the [Model Zoo](https://mmpose.readthedocs.io/en/lates
- [x] [DeepPose](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#deeppose-cvpr-2014) (CVPR'2014)
- [x] [CPM](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/backbones.html#cpm-cvpr-2016) (CVPR'2016)
- [x] [Hourglass](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/backbones.html#hourglass-eccv-2016) (ECCV'2016)
- - [ ] [SimpleBaseline3D](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#simplebaseline3d-iccv-2017) (ICCV'2017)
+ - [x] [SimpleBaseline3D](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#simplebaseline3d-iccv-2017) (ICCV'2017)
- [ ] [Associative Embedding](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#associative-embedding-nips-2017) (NeurIPS'2017)
- [x] [SimpleBaseline2D](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#simplebaseline2d-eccv-2018) (ECCV'2018)
- [x] [DSNT](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#dsnt-2018) (ArXiv'2021)
- [x] [HRNet](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/backbones.html#hrnet-cvpr-2019) (CVPR'2019)
- [x] [IPR](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#ipr-eccv-2018) (ECCV'2018)
- - [ ] [VideoPose3D](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#videopose3d-cvpr-2019) (CVPR'2019)
+ - [x] [VideoPose3D](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/algorithms.html#videopose3d-cvpr-2019) (CVPR'2019)
- [x] [HRNetv2](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/backbones.html#hrnetv2-tpami-2019) (TPAMI'2019)
- [x] [MSPN](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/backbones.html#mspn-arxiv-2019) (ArXiv'2019)
- [x] [SCNet](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/backbones.html#scnet-cvpr-2020) (CVPR'2020)
@@ -238,14 +243,14 @@ A summary can be found in the [Model Zoo](https://mmpose.readthedocs.io/en/lates
<details close>
<summary><b>Supported techniques:</b></summary>

- - [ ] [FPN](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#fpn-cvpr-2017) (CVPR'2017)
- - [ ] [FP16](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#fp16-arxiv-2017) (ArXiv'2017)
- - [ ] [Wingloss](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#wingloss-cvpr-2018) (CVPR'2018)
- - [ ] [AdaptiveWingloss](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#adaptivewingloss-iccv-2019) (ICCV'2019)
+ - [x] [FPN](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#fpn-cvpr-2017) (CVPR'2017)
+ - [x] [FP16](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#fp16-arxiv-2017) (ArXiv'2017)
+ - [x] [Wingloss](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#wingloss-cvpr-2018) (CVPR'2018)
+ - [x] [AdaptiveWingloss](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#adaptivewingloss-iccv-2019) (ICCV'2019)
- [x] [DarkPose](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#darkpose-cvpr-2020) (CVPR'2020)
- [x] [UDP](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#udp-cvpr-2020) (CVPR'2020)
- - [ ] [Albumentations](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#albumentations-information-2020) (Information'2020)
- - [ ] [SoftWingloss](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#softwingloss-tip-2021) (TIP'2021)
+ - [x] [Albumentations](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#albumentations-information-2020) (Information'2020)
+ - [x] [SoftWingloss](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#softwingloss-tip-2021) (TIP'2021)
- [x] [RLE](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/techniques.html#rle-iccv-2021) (ICCV'2021)

</details>
@@ -284,6 +289,8 @@ A summary can be found in the [Model Zoo](https://mmpose.readthedocs.io/en/lates
- [x] [InterHand2.6M](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/datasets.html#interhand2-6m-eccv-2020) \[[homepage](https://mks0601.github.io/InterHand2.6M/)\] (ECCV'2020)
- [x] [AP-10K](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/datasets.html#ap-10k-neurips-2021) \[[homepage](https://github.com/AlexTheBad/AP-10K)\] (NeurIPS'2021)
- [x] [Horse-10](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/datasets.html#horse-10-wacv-2021) \[[homepage](http://www.mackenziemathislab.org/horse10)\] (WACV'2021)
+ - [x] [Human-Art](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/datasets.html#human-art-cvpr-2023) \[[homepage](https://idea-research.github.io/HumanArt/)\] (CVPR'2023)
+ - [x] [LaPa](https://mmpose.readthedocs.io/en/latest/model_zoo_papers/datasets.html#lapa-aaai-2020) \[[homepage](https://github.com/JDAI-CV/lapa-dataset)\] (AAAI'2020)

</details>
@@ -309,7 +316,7 @@ A summary can be found in the [Model Zoo](https://mmpose.readthedocs.io/en/lates
### Model Request

- We will keep up with the latest progress of the community, and support more popular algorithms and frameworks. If you have any feature requests, please feel free to leave a comment in [MMPose Roadmap](https://github.com/open-mmlab/mmpose/issues/9).
+ We will keep up with the latest progress of the community, and support more popular algorithms and frameworks. If you have any feature requests, please feel free to leave a comment in [MMPose Roadmap](https://github.com/open-mmlab/mmpose/issues/2258).
## Contributing