mirror of https://github.com/yt-dlp/yt-dlp
synced 2025-12-16 22:25:40 +07:00
Compare commits
2 Commits
2023.03.04 ... 2022.01.21

| Author | SHA1 | Date |
|---|---|---|
| | b54e1255ce | |
| | f20d607b0e | |
@@ -1,8 +0,0 @@
root = true

[**.py]
charset = utf-8
indent_size = 4
indent_style = space
trim_trailing_whitespace = true
insert_final_newline = true
2 .gitattributes vendored
@@ -2,5 +2,3 @@

Makefile* text whitespace=-tab-in-indent
*.sh text eol=lf
*.md diff=markdown
*.py diff=python
71 .github/ISSUE_TEMPLATE/1_broken_site.yml vendored
@@ -1,14 +1,7 @@
name: Broken site
description: Report error in a supported site
name: Broken site support
description: Report broken or misfunctioning site
labels: [triage, site-bug]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -16,15 +9,15 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm reporting that a **supported** site is broken
- label: I'm reporting a broken site
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **2022.01.21**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -33,47 +26,37 @@ body:
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-vU` flag to your command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2022.01.21 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
yt-dlp is up to date (2022.01.21)
<more lines>
render: shell
validations:
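Several of the checklists above tell API users to add `'verbose': True` to the `YoutubeDL` params instead of passing `-vU` on the command line. A minimal sketch of that usage, assuming yt-dlp is installed as the `yt_dlp` package (the URL is simply the example one from the placeholder logs above):

```python
# Minimal sketch of the "If using API, add 'verbose': True to YoutubeDL params"
# checklist item: embedding yt-dlp as a library and enabling the same [debug]
# output that the issue templates ask for.
from yt_dlp import YoutubeDL

params = {'verbose': True}  # prints [debug] lines similar to the placeholder log

with YoutubeDL(params) as ydl:
    # Example URL taken from the debug output shown above; replace with your own.
    ydl.download(['http://www.youtube.com/watch?v=BaW_jenozKc'])
```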
@@ -2,13 +2,6 @@ name: Site support request
description: Request support for a new site
labels: [triage, site-request]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -18,13 +11,13 @@ body:
options:
- label: I'm reporting a new site support request
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **2022.01.21**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -33,8 +26,8 @@ body:
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
@@ -50,42 +43,31 @@ body:
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide any additional information
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output **using one of the example URLs provided above**.
Add the `-vU` flag to your command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2022.01.21 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
yt-dlp is up to date (2022.01.21)
<more lines>
render: shell
validations:
@@ -2,13 +2,6 @@ name: Site feature request
description: Request a new functionality for a supported site
labels: [triage, site-enhancement]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -16,13 +9,13 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a site-specific feature
- label: I'm reporting a site feature request
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **2022.01.21**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -31,8 +24,8 @@ body:
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
@@ -46,42 +39,33 @@ body:
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output of yt-dlp that demonstrates the need for the enhancement.
Add the `-vU` flag to your command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2022.01.21 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
yt-dlp is up to date (2022.01.21)
<more lines>
render: shell
validations:
62 .github/ISSUE_TEMPLATE/4_bug_report.yml vendored
@@ -2,13 +2,6 @@ name: Bug report
description: Report a bug unrelated to any particular site or extractor
labels: [triage, bug]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -18,55 +11,46 @@ body:
options:
- label: I'm reporting a bug unrelated to a specific site
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **2022.01.21**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-vU` flag to **your** command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2022.01.21 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
yt-dlp is up to date (2022.01.21)
<more lines>
render: shell
validations:
55 .github/ISSUE_TEMPLATE/5_feature_request.yml vendored
@@ -2,13 +2,6 @@ name: Feature request
description: Request a new functionality unrelated to any particular site or extractor
labels: [triage, enhancement]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -16,52 +9,22 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a feature unrelated to a specific site
- label: I'm reporting a feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
- label: I've verified that I'm running yt-dlp version **2022.01.21**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
- type: textarea
id: log
attributes:
label: Complete Verbose Output
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
<more lines>
render: shell
67 .github/ISSUE_TEMPLATE/6_question.yml vendored
@@ -2,19 +2,6 @@ name: Ask question
description: Ask yt-dlp related question
labels: [question]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: markdown
attributes:
value: |
### Make sure you are **only** asking a question and not reporting a bug or requesting a feature.
If your question contains "isn't working" or "can you add", this is most likely the wrong template.
If you are in doubt whether this is the right template, **USE ANOTHER TEMPLATE**!
- type: checkboxes
id: checklist
attributes:
@@ -22,52 +9,44 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm asking a question and **not** reporting a bug or requesting a feature
- label: I'm asking a question and **not** reporting a bug/feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions including closed ones
required: true
- type: textarea
id: question
attributes:
label: Please make sure the question is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information and as much context and examples as possible
label: Question
description: |
Ask your question in an arbitrary form.
Please make sure it's worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information and as much context and examples as possible.
If your question contains "isn't working" or "can you add", this is most likely the wrong template
placeholder: WRITE QUESTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
If your question involes a yt-dlp command, provide the complete verbose output of that command.
Add the `-vU` flag to **your** command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.01 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
yt-dlp is up to date (2021.12.01)
<more lines>
render: shell
3 .github/ISSUE_TEMPLATE/config.yml vendored
@@ -3,6 +3,3 @@ contact_links:
- name: Get help from the community on Discord
url: https://discord.gg/H5MNcFW63r
about: Join the yt-dlp Discord for community-powered support!
- name: Matrix Bridge to the Discord server
url: https://matrix.to/#/#yt-dlp:matrix.org
about: For those who do not want to use Discord
51 .github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml vendored
@@ -1,8 +1,7 @@
name: Broken site
description: Report error in a supported site
name: Broken site support
description: Report broken or misfunctioning site
labels: [triage, site-bug]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -10,15 +9,15 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm reporting that a **supported** site is broken
- label: I'm reporting a broken site
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -27,14 +26,38 @@ body:
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-vU` flag to your command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s
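The `.github/ISSUE_TEMPLATE_tmpl/*.yml` files above carry `%(version)s`, `%(no_skip)s` and `%(verbose)s` placeholders, from which the published templates under `.github/ISSUE_TEMPLATE/` are generated. The generator script itself is not part of this diff, but the placeholders are ordinary Python %-style fields, so the expansion works roughly like the sketch below; the field values shown are illustrative assumptions, not the project's actual snippets.

```python
# Illustrative sketch only: expand the %-style placeholders of one *_tmpl file.
# The project's real generator script is not shown in this diff.
from pathlib import Path

fields = {
    'version': '2023.03.04',  # release tag substituted into the version checkbox
    'no_skip': '',            # assumed: shared "DO NOT REMOVE OR SKIP" checkbox block
    'verbose': '',            # assumed: shared verbose-output section
    'verbose_optional': '',   # assumed: optional variant used by other templates
}

tmpl = Path('.github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml').read_text()
Path('.github/ISSUE_TEMPLATE/1_broken_site.yml').write_text(tmpl % fields)
```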
@@ -2,7 +2,6 @@ name: Site support request
description: Request support for a new site
labels: [triage, site-request]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -12,13 +11,13 @@ body:
options:
- label: I'm reporting a new site support request
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -27,8 +26,8 @@ body:
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
@@ -44,9 +43,32 @@ body:
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide any additional information
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output **using one of the example URLs provided above**.
Add the `-vU` flag to your command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s
@@ -2,7 +2,6 @@ name: Site feature request
description: Request a new functionality for a supported site
labels: [triage, site-enhancement]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -10,13 +9,13 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a site-specific feature
- label: I'm reporting a site feature request
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -25,8 +24,8 @@ body:
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
@@ -40,9 +39,34 @@ body:
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output of yt-dlp that demonstrates the need for the enhancement.
Add the `-vU` flag to your command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s
42 .github/ISSUE_TEMPLATE_tmpl/4_bug_report.yml vendored
@@ -2,7 +2,6 @@ name: Bug report
description: Report a bug unrelated to any particular site or extractor
labels: [triage, bug]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -12,22 +11,47 @@ body:
options:
- label: I'm reporting a bug unrelated to a specific site
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-vU` flag to **your** command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s
@@ -2,7 +2,6 @@ name: Feature request
description: Request a new functionality unrelated to any particular site or extractor
labels: [triage, enhancement]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -10,22 +9,22 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a feature unrelated to a specific site
- label: I'm reporting a feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
%(verbose_optional)s
47 .github/ISSUE_TEMPLATE_tmpl/6_question.yml vendored
@@ -2,13 +2,6 @@ name: Ask question
description: Ask yt-dlp related question
labels: [question]
body:
%(no_skip)s
- type: markdown
attributes:
value: |
### Make sure you are **only** asking a question and not reporting a bug or requesting a feature.
If your question contains "isn't working" or "can you add", this is most likely the wrong template.
If you are in doubt whether this is the right template, **USE ANOTHER TEMPLATE**!
- type: checkboxes
id: checklist
attributes:
@@ -16,22 +9,44 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm asking a question and **not** reporting a bug or requesting a feature
- label: I'm asking a question and **not** reporting a bug/feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions including closed ones
required: true
- type: textarea
id: question
attributes:
label: Please make sure the question is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information and as much context and examples as possible
label: Question
description: |
Ask your question in an arbitrary form.
Please make sure it's worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information and as much context and examples as possible.
If your question contains "isn't working" or "can you add", this is most likely the wrong template
placeholder: WRITE QUESTION HERE
validations:
required: true
%(verbose_optional)s
- type: textarea
id: log
attributes:
label: Verbose log
description: |
If your question involes a yt-dlp command, provide the complete verbose output of that command.
Add the `-vU` flag to **your** command line you run yt-dlp with (`yt-dlp -vU <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.01 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (2021.12.01)
|
||||
<more lines>
|
||||
render: shell
|
||||
|
||||
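The template above asks reporters to attach the complete verbose log produced with the `-vU` flags. A minimal sketch of the invocation that generates such a log (the URL below is only the placeholder used in the template, not a required value):

```shell
# Add -v (verbose) and -U (check for updates) to the exact command being reported
yt-dlp -vU "http://www.youtube.com/watch?v=BaW_jenozKc"
```

The `[debug]` lines shown in the placeholder are the opening of that output.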
43  .github/PULL_REQUEST_TEMPLATE.md  (vendored)

@@ -1,43 +1,28 @@
**IMPORTANT**: PRs without the template will be CLOSED

### Description of your *pull request* and other information

<!--

Explanation of your *pull request* in arbitrary form goes here. Please **make sure the description explains the purpose and effect** of your *pull request* and is worded well enough to be understood. Provide as much **context and examples** as possible

-->

ADD DESCRIPTION HERE

Fixes #

<details open><summary>Template</summary> <!-- OPEN is intentional -->

<!--

# PLEASE FOLLOW THE GUIDE BELOW
## Please follow the guide below

- You will be asked some questions, please read them **carefully** and answer honestly
- Put an `x` into all the boxes `[ ]` relevant to your *pull request* (like [x])
- Put an `x` into all the boxes [ ] relevant to your *pull request* (like that [x])
- Use *Preview* tab to see how your *pull request* will actually look like

-->
---

### Before submitting a *pull request* make sure you have:
- [ ] At least skimmed through [contributing guidelines](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions) including [yt-dlp coding conventions](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#yt-dlp-coding-conventions)
- [ ] [Searched](https://github.com/yt-dlp/yt-dlp/search?q=is%3Apr&type=Issues) the bugtracker for similar pull requests
- [ ] Checked the code with [flake8](https://pypi.python.org/pypi/flake8) and [ran relevant tests](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions)
- [ ] Checked the code with [flake8](https://pypi.python.org/pypi/flake8)

### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check all of the following options that apply:
### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check one of the following options:
- [ ] I am the original author of this code and I am willing to release it under [Unlicense](http://unlicense.org/)
- [ ] I am not the original author of this code but it is in public domain or released under [Unlicense](http://unlicense.org/) (provide reliable evidence)

### What is the purpose of your *pull request*?
- [ ] Fix or improvement to an extractor (Make sure to add/update tests)
- [ ] New extractor ([Piracy websites will not be accepted](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy))
- [ ] Core bug fix/improvement
- [ ] New feature (It is strongly [recommended to open an issue first](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#adding-new-feature-or-making-overarching-changes))
- [ ] Bug fix
- [ ] Improvement
- [ ] New extractor
- [ ] New feature

</details>
---

### Description of your *pull request* and other information

Explanation of your *pull request* in arbitrary form goes here. Please make sure the description explains the purpose and effect of your *pull request* and is worded well enough to be understood. Provide as much context and examples as possible.
718  .github/workflows/build.yml  (vendored)

@@ -1,356 +1,414 @@
name: Build Artifacts
on:
workflow_call:
inputs:
version:
required: true
type: string
channel:
required: false
default: stable
type: string
unix:
default: true
type: boolean
linux_arm:
default: true
type: boolean
macos:
default: true
type: boolean
macos_legacy:
default: true
type: boolean
windows:
default: true
type: boolean
windows32:
default: true
type: boolean
meta_files:
default: true
type: boolean
secrets:
GPG_SIGNING_KEY:
required: false

workflow_dispatch:
inputs:
version:
description: Version tag (YYYY.MM.DD[.REV])
required: true
type: string
channel:
description: Update channel (stable/nightly)
required: true
default: stable
type: string
unix:
description: yt-dlp, yt-dlp.tar.gz, yt-dlp_linux, yt-dlp_linux.zip
default: true
type: boolean
linux_arm:
description: yt-dlp_linux_aarch64, yt-dlp_linux_armv7l
default: true
type: boolean
macos:
description: yt-dlp_macos, yt-dlp_macos.zip
default: true
type: boolean
macos_legacy:
description: yt-dlp_macos_legacy
default: true
type: boolean
windows:
description: yt-dlp.exe, yt-dlp_min.exe, yt-dlp_win.zip
default: true
type: boolean
windows32:
description: yt-dlp_x86.exe
default: true
type: boolean
meta_files:
description: SHA2-256SUMS, SHA2-512SUMS, _update_spec
default: true
type: boolean

permissions:
contents: read
name: Build
on: workflow_dispatch

jobs:
unix:
if: inputs.unix
build_unix:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.10"
- uses: conda-incubator/setup-miniconda@v2
with:
miniforge-variant: Mambaforge
use-mamba: true
channels: conda-forge
auto-update-conda: true
activate-environment: ""
auto-activate-base: false
- name: Install Requirements
run: |
sudo apt-get -y install zip pandoc man sed
python -m pip install -U pip setuptools wheel
python -m pip install -U Pyinstaller -r requirements.txt
reqs=$(mktemp)
cat > $reqs << EOF
python=3.10.*
pyinstaller
cffi
brotli-python
EOF
sed '/^brotli.*/d' requirements.txt >> $reqs
mamba create -n build --file $reqs

- name: Prepare
run: |
python devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python devscripts/make_lazy_extractors.py
- name: Build Unix platform-independent binary
run: |
make all tar
- name: Build Unix standalone binary
shell: bash -l {0}
run: |
unset LD_LIBRARY_PATH # Harmful; set by setup-python
conda activate build
python pyinst.py --onedir
(cd ./dist/yt-dlp_linux && zip -r ../yt-dlp_linux.zip .)
python pyinst.py
mv ./dist/yt-dlp_linux ./yt-dlp_linux
mv ./dist/yt-dlp_linux.zip ./yt-dlp_linux.zip

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
yt-dlp
yt-dlp.tar.gz
yt-dlp_linux
yt-dlp_linux.zip
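The `unix` job above boils down to a few devscripts, a `make` target and a PyInstaller invocation. A rough local sketch of the same sequence (assuming the repository root as working directory and that `requirements.txt` installs cleanly on your system; the version-bump step is skipped because it needs the workflow inputs):

```shell
# Rough local equivalent of the Unix build steps shown above
python -m pip install -U pip setuptools wheel
python -m pip install -U pyinstaller -r requirements.txt
python devscripts/make_lazy_extractors.py   # pre-generate lazy extractors
make all tar                                # platform-independent binary + source tarball
python pyinst.py                            # standalone PyInstaller binary in ./dist/
```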
linux_arm:
if: inputs.linux_arm
permissions:
contents: read
packages: write # for creating cache
runs-on: ubuntu-latest
strategy:
matrix:
architecture:
- armv7
- aarch64
outputs:
version_suffix: ${{ steps.version_suffix.outputs.version_suffix }}
ytdlp_version: ${{ steps.bump_version.outputs.ytdlp_version }}
upload_url: ${{ steps.create_release.outputs.upload_url }}
sha256_bin: ${{ steps.sha256_bin.outputs.sha256_bin }}
sha512_bin: ${{ steps.sha512_bin.outputs.sha512_bin }}
sha256_tar: ${{ steps.sha256_tar.outputs.sha256_tar }}
sha512_tar: ${{ steps.sha512_tar.outputs.sha512_tar }}

steps:
- uses: actions/checkout@v3
with:
path: ./repo
- name: Virtualized Install, Prepare & Build
uses: yt-dlp/run-on-arch-action@v2
with:
# Ref: https://github.com/uraimo/run-on-arch-action/issues/55
env: |
GITHUB_WORKFLOW: build
githubToken: ${{ github.token }} # To cache image
arch: ${{ matrix.architecture }}
distro: ubuntu18.04 # Standalone executable should be built on minimum supported OS
dockerRunArgs: --volume "${PWD}/repo:/repo"
install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
apt update
apt -y install zlib1g-dev python3.8 python3.8-dev python3.8-distutils python3-pip
python3.8 -m pip install -U pip setuptools wheel
# Cannot access requirements.txt from the repo directory at this stage
python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.8'
- name: Install packages
run: sudo apt-get -y install zip pandoc man
- name: Set version suffix
id: version_suffix
env:
PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
if: "env.PUSH_VERSION_COMMIT == ''"
run: echo ::set-output name=version_suffix::$(date -u +"%H%M%S")
- name: Bump version
id: bump_version
run: |
python devscripts/update-version.py ${{ steps.version_suffix.outputs.version_suffix }}
make issuetemplates
- name: Push to release
id: push_release
run: |
git config --global user.name github-actions
git config --global user.email github-actions@example.com
git add -u
git commit -m "[version] update" -m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all"
git push origin --force ${{ github.event.ref }}:release
echo ::set-output name=head_sha::$(git rev-parse HEAD)
- name: Update master
id: push_master
env:
PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
if: "env.PUSH_VERSION_COMMIT != ''"
run: git push origin ${{ github.event.ref }}
- name: Get Changelog
id: get_changelog
run: |
changelog=$(cat Changelog.md | grep -oPz '(?s)(?<=### ${{ steps.bump_version.outputs.ytdlp_version }}\n{2}).+?(?=\n{2,3}###)') || true
echo "changelog<<EOF" >> $GITHUB_ENV
echo "$changelog" >> $GITHUB_ENV
echo "EOF" >> $GITHUB_ENV

run: |
cd repo
python3.8 -m pip install -U Pyinstaller -r requirements.txt # Cached version may be out of date
python3.8 devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python3.8 devscripts/make_lazy_extractors.py
python3.8 pyinst.py
- name: Build lazy extractors
id: lazy_extractors
run: python devscripts/make_lazy_extractors.py
- name: Run Make
run: make all tar
- name: Get SHA2-256SUMS for yt-dlp
id: sha256_bin
run: echo "::set-output name=sha256_bin::$(sha256sum yt-dlp | awk '{print $1}')"
- name: Get SHA2-256SUMS for yt-dlp.tar.gz
id: sha256_tar
run: echo "::set-output name=sha256_tar::$(sha256sum yt-dlp.tar.gz | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp
id: sha512_bin
run: echo "::set-output name=sha512_bin::$(sha512sum yt-dlp | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp.tar.gz
id: sha512_tar
run: echo "::set-output name=sha512_tar::$(sha512sum yt-dlp.tar.gz | awk '{print $1}')"

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: | # run-on-arch-action designates armv7l as armv7
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
- name: Install dependencies for pypi
env:
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
if: "env.PYPI_TOKEN != ''"
run: |
python -m pip install --upgrade pip
pip install setuptools wheel twine
- name: Build and publish on pypi
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
if: "env.TWINE_PASSWORD != ''"
run: |
rm -rf dist/*
python setup.py sdist bdist_wheel
twine upload dist/*
macos:
if: inputs.macos
- name: Install SSH private key
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
if: "env.BREW_TOKEN != ''"
uses: yt-dlp/ssh-agent@v0.5.3
with:
ssh-private-key: ${{ env.BREW_TOKEN }}
- name: Update Homebrew Formulae
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
if: "env.BREW_TOKEN != ''"
run: |
git clone git@github.com:yt-dlp/homebrew-taps taps/
python3 devscripts/update-formulae.py taps/Formula/yt-dlp.rb "${{ steps.bump_version.outputs.ytdlp_version }}"
git -C taps/ config user.name github-actions
git -C taps/ config user.email github-actions@example.com
git -C taps/ commit -am 'yt-dlp: ${{ steps.bump_version.outputs.ytdlp_version }}'
git -C taps/ push

- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ steps.bump_version.outputs.ytdlp_version }}
release_name: yt-dlp ${{ steps.bump_version.outputs.ytdlp_version }}
commitish: ${{ steps.push_release.outputs.head_sha }}
body: |
#### [A description of the various files]((https://github.com/yt-dlp/yt-dlp#release-files)) are in the README

---

### Changelog:
${{ env.changelog }}
draft: false
prerelease: false
- name: Upload yt-dlp Unix binary
id: upload-release-asset
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yt-dlp
asset_name: yt-dlp
asset_content_type: application/octet-stream
- name: Upload Source tar
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yt-dlp.tar.gz
asset_name: yt-dlp.tar.gz
asset_content_type: application/gzip
build_macos:
runs-on: macos-11
needs: build_unix
outputs:
sha256_macos: ${{ steps.sha256_macos.outputs.sha256_macos }}
sha512_macos: ${{ steps.sha512_macos.outputs.sha512_macos }}
sha256_macos_zip: ${{ steps.sha256_macos_zip.outputs.sha256_macos_zip }}
sha512_macos_zip: ${{ steps.sha512_macos_zip.outputs.sha512_macos_zip }}

steps:
- uses: actions/checkout@v3
# NB: In order to create a universal2 application, the version of python3 in /usr/bin has to be used
- name: Install Requirements
run: |
- uses: actions/checkout@v2
# In order to create a universal2 application, the version of python3 in /usr/bin has to be used
# Pyinstaller is pinned to 4.5.1 because the builds are failing in 4.6, 4.7
- name: Install Requirements
run: |
brew install coreutils
/usr/bin/python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
/usr/bin/python3 -m pip install -U --user pip Pyinstaller==4.5.1 -r requirements.txt
- name: Bump version
id: bump_version
run: /usr/bin/python3 devscripts/update-version.py
- name: Build lazy extractors
id: lazy_extractors
run: /usr/bin/python3 devscripts/make_lazy_extractors.py
- name: Run PyInstaller Script
run: /usr/bin/python3 pyinst.py --target-architecture universal2 --onefile
- name: Upload yt-dlp MacOS binary
id: upload-release-macos
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_macos
asset_name: yt-dlp_macos
asset_content_type: application/octet-stream
- name: Get SHA2-256SUMS for yt-dlp_macos
id: sha256_macos
run: echo "::set-output name=sha256_macos::$(sha256sum dist/yt-dlp_macos | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp_macos
id: sha512_macos
run: echo "::set-output name=sha512_macos::$(sha512sum dist/yt-dlp_macos | awk '{print $1}')"

- name: Prepare
run: |
/usr/bin/python3 devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
/usr/bin/python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
/usr/bin/python3 pyinst.py --target-architecture universal2 --onedir
(cd ./dist/yt-dlp_macos && zip -r ../yt-dlp_macos.zip .)
/usr/bin/python3 pyinst.py --target-architecture universal2
- name: Run PyInstaller Script with --onedir
run: |
/usr/bin/python3 pyinst.py --target-architecture universal2 --onedir
zip ./dist/yt-dlp_macos.zip ./dist/yt-dlp_macos
- name: Upload yt-dlp MacOS onedir
id: upload-release-macos-zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_macos.zip
asset_name: yt-dlp_macos.zip
asset_content_type: application/zip
- name: Get SHA2-256SUMS for yt-dlp_macos.zip
id: sha256_macos_zip
run: echo "::set-output name=sha256_macos_zip::$(sha256sum dist/yt-dlp_macos.zip | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp_macos.zip
id: sha512_macos_zip
run: echo "::set-output name=sha512_macos_zip::$(sha512sum dist/yt-dlp_macos.zip | awk '{print $1}')"

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp_macos
dist/yt-dlp_macos.zip
macos_legacy:
if: inputs.macos_legacy
runs-on: macos-latest

steps:
- uses: actions/checkout@v3
- name: Install Python
# We need the official Python, because the GA ones only support newer macOS versions
env:
PYTHON_VERSION: 3.10.5
MACOSX_DEPLOYMENT_TARGET: 10.9 # Used up by the Python build tools
run: |
# Hack to get the latest patch version. Uncomment if needed
#brew install python@3.10
#export PYTHON_VERSION=$( $(brew --prefix)/opt/python@3.10/bin/python3 --version | cut -d ' ' -f 2 )
curl https://www.python.org/ftp/python/${PYTHON_VERSION}/python-${PYTHON_VERSION}-macos11.pkg -o "python.pkg"
sudo installer -pkg python.pkg -target /
python3 --version
- name: Install Requirements
run: |
brew install coreutils
python3 -m pip install -U --user pip Pyinstaller -r requirements.txt

- name: Prepare
run: |
python3 devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
python3 pyinst.py
mv dist/yt-dlp_macos dist/yt-dlp_macos_legacy

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp_macos_legacy
windows:
if: inputs.windows
build_windows:
runs-on: windows-latest
needs: build_unix
outputs:
sha256_win: ${{ steps.sha256_win.outputs.sha256_win }}
sha512_win: ${{ steps.sha512_win.outputs.sha512_win }}
sha256_py2exe: ${{ steps.sha256_py2exe.outputs.sha256_py2exe }}
sha512_py2exe: ${{ steps.sha512_py2exe.outputs.sha512_py2exe }}
sha256_win_zip: ${{ steps.sha256_win_zip.outputs.sha256_win_zip }}
sha512_win_zip: ${{ steps.sha512_win_zip.outputs.sha512_win_zip }}

steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with: # 3.8 is used for Win7 support
python-version: "3.8"
- name: Install Requirements
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
python -m pip install -U pip setuptools wheel py2exe
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-5.8.0-py3-none-any.whl" -r requirements.txt
- uses: actions/checkout@v2
# 3.8 is used for Win7 support
- name: Set up Python 3.8
uses: actions/setup-python@v2
with:
python-version: '3.8'
- name: Install Requirements
# Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
run: |
python -m pip install --upgrade pip setuptools wheel py2exe
pip install "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-4.5.1-py3-none-any.whl" -r requirements.txt
- name: Bump version
id: bump_version
env:
version_suffix: ${{ needs.build_unix.outputs.version_suffix }}
run: python devscripts/update-version.py ${{ env.version_suffix }}
- name: Build lazy extractors
id: lazy_extractors
run: python devscripts/make_lazy_extractors.py
- name: Run PyInstaller Script
run: python pyinst.py
- name: Upload yt-dlp.exe Windows binary
id: upload-release-windows
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp.exe
asset_name: yt-dlp.exe
asset_content_type: application/vnd.microsoft.portable-executable
- name: Get SHA2-256SUMS for yt-dlp.exe
id: sha256_win
run: echo "::set-output name=sha256_win::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp.exe
id: sha512_win
run: echo "::set-output name=sha512_win::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA512).Hash.ToLower())"

- name: Prepare
run: |
python devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python setup.py py2exe
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
python pyinst.py
python pyinst.py --onedir
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
- name: Run PyInstaller Script with --onedir
run: |
python pyinst.py --onedir
Compress-Archive -LiteralPath ./dist/yt-dlp -DestinationPath ./dist/yt-dlp_win.zip
- name: Upload yt-dlp Windows onedir
id: upload-release-windows-zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_win.zip
asset_name: yt-dlp_win.zip
asset_content_type: application/zip
- name: Get SHA2-256SUMS for yt-dlp_win.zip
id: sha256_win_zip
run: echo "::set-output name=sha256_win_zip::$((Get-FileHash dist\yt-dlp_win.zip -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp_win.zip
id: sha512_win_zip
run: echo "::set-output name=sha512_win_zip::$((Get-FileHash dist\yt-dlp_win.zip -Algorithm SHA512).Hash.ToLower())"

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp.exe
dist/yt-dlp_min.exe
dist/yt-dlp_win.zip
- name: Run py2exe Script
run: python setup.py py2exe
- name: Upload yt-dlp_min.exe Windows binary
id: upload-release-windows-py2exe
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp.exe
asset_name: yt-dlp_min.exe
asset_content_type: application/vnd.microsoft.portable-executable
- name: Get SHA2-256SUMS for yt-dlp_min.exe
id: sha256_py2exe
run: echo "::set-output name=sha256_py2exe::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp_min.exe
id: sha512_py2exe
run: echo "::set-output name=sha512_py2exe::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA512).Hash.ToLower())"
windows32:
if: inputs.windows32
build_windows32:
runs-on: windows-latest
needs: build_unix

outputs:
sha256_win32: ${{ steps.sha256_win32.outputs.sha256_win32 }}
sha512_win32: ${{ steps.sha512_win32.outputs.sha512_win32 }}

steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with: # 3.7 is used for Vista support. See https://github.com/yt-dlp/yt-dlp/issues/390
python-version: "3.7"
architecture: "x86"
- name: Install Requirements
run: |
python -m pip install -U pip setuptools wheel
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-5.8.0-py3-none-any.whl" -r requirements.txt
- uses: actions/checkout@v2
# 3.7 is used for Vista support. See https://github.com/yt-dlp/yt-dlp/issues/390
- name: Set up Python 3.7 32-Bit
uses: actions/setup-python@v2
with:
python-version: '3.7'
architecture: 'x86'
- name: Install Requirements
run: |
python -m pip install --upgrade pip setuptools wheel
pip install "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-4.5.1-py3-none-any.whl" -r requirements.txt
- name: Bump version
id: bump_version
env:
version_suffix: ${{ needs.build_unix.outputs.version_suffix }}
run: python devscripts/update-version.py ${{ env.version_suffix }}
- name: Build lazy extractors
id: lazy_extractors
run: python devscripts/make_lazy_extractors.py
- name: Run PyInstaller Script for 32 Bit
run: python pyinst.py
- name: Upload Executable yt-dlp_x86.exe
id: upload-release-windows32
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_x86.exe
asset_name: yt-dlp_x86.exe
asset_content_type: application/vnd.microsoft.portable-executable
- name: Get SHA2-256SUMS for yt-dlp_x86.exe
id: sha256_win32
run: echo "::set-output name=sha256_win32::$((Get-FileHash dist\yt-dlp_x86.exe -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp_x86.exe
id: sha512_win32
run: echo "::set-output name=sha512_win32::$((Get-FileHash dist\yt-dlp_x86.exe -Algorithm SHA512).Hash.ToLower())"

- name: Prepare
run: |
python devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python pyinst.py

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp_x86.exe
meta_files:
if: inputs.meta_files && always()
needs:
- unix
- linux_arm
- macos
- macos_legacy
- windows
- windows32
finish:
runs-on: ubuntu-latest
needs: [build_unix, build_windows, build_windows32, build_macos]

steps:
- uses: actions/download-artifact@v3

- name: Make SHA2-SUMS files
run: |
cd ./artifact/
sha256sum * > ../SHA2-256SUMS
sha512sum * > ../SHA2-512SUMS

- name: Make Update spec
run: |
cat >> _update_spec << EOF
# This file is used for regulating self-update
lock 2022.08.18.36 .+ Python 3.6
EOF

- name: Sign checksum files
env:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
if: env.GPG_SIGNING_KEY != ''
run: |
gpg --batch --import <<< "${{ secrets.GPG_SIGNING_KEY }}"
for signfile in ./SHA*SUMS; do
gpg --batch --detach-sign "$signfile"
done

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
SHA*SUMS*
_update_spec
- name: Make SHA2-256SUMS file
env:
SHA256_BIN: ${{ needs.build_unix.outputs.sha256_bin }}
SHA256_TAR: ${{ needs.build_unix.outputs.sha256_tar }}
SHA256_WIN: ${{ needs.build_windows.outputs.sha256_win }}
SHA256_PY2EXE: ${{ needs.build_windows.outputs.sha256_py2exe }}
SHA256_WIN_ZIP: ${{ needs.build_windows.outputs.sha256_win_zip }}
SHA256_WIN32: ${{ needs.build_windows32.outputs.sha256_win32 }}
SHA256_MACOS: ${{ needs.build_macos.outputs.sha256_macos }}
SHA256_MACOS_ZIP: ${{ needs.build_macos.outputs.sha256_macos_zip }}
run: |
echo "${{ env.SHA256_BIN }} yt-dlp" >> SHA2-256SUMS
echo "${{ env.SHA256_TAR }} yt-dlp.tar.gz" >> SHA2-256SUMS
echo "${{ env.SHA256_WIN }} yt-dlp.exe" >> SHA2-256SUMS
echo "${{ env.SHA256_PY2EXE }} yt-dlp_min.exe" >> SHA2-256SUMS
echo "${{ env.SHA256_WIN32 }} yt-dlp_x86.exe" >> SHA2-256SUMS
echo "${{ env.SHA256_WIN_ZIP }} yt-dlp_win.zip" >> SHA2-256SUMS
echo "${{ env.SHA256_MACOS }} yt-dlp_macos" >> SHA2-256SUMS
echo "${{ env.SHA256_MACOS_ZIP }} yt-dlp_macos.zip" >> SHA2-256SUMS
- name: Upload 256SUMS file
id: upload-sums
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./SHA2-256SUMS
asset_name: SHA2-256SUMS
asset_content_type: text/plain
- name: Make SHA2-512SUMS file
env:
SHA512_BIN: ${{ needs.build_unix.outputs.sha512_bin }}
SHA512_TAR: ${{ needs.build_unix.outputs.sha512_tar }}
SHA512_WIN: ${{ needs.build_windows.outputs.sha512_win }}
SHA512_PY2EXE: ${{ needs.build_windows.outputs.sha512_py2exe }}
SHA512_WIN_ZIP: ${{ needs.build_windows.outputs.sha512_win_zip }}
SHA512_WIN32: ${{ needs.build_windows32.outputs.sha512_win32 }}
SHA512_MACOS: ${{ needs.build_macos.outputs.sha512_macos }}
SHA512_MACOS_ZIP: ${{ needs.build_macos.outputs.sha512_macos_zip }}
run: |
echo "${{ env.SHA512_BIN }} yt-dlp" >> SHA2-512SUMS
echo "${{ env.SHA512_TAR }} yt-dlp.tar.gz" >> SHA2-512SUMS
echo "${{ env.SHA512_WIN }} yt-dlp.exe" >> SHA2-512SUMS
echo "${{ env.SHA512_WIN_ZIP }} yt-dlp_win.zip" >> SHA2-512SUMS
echo "${{ env.SHA512_PY2EXE }} yt-dlp_min.exe" >> SHA2-512SUMS
echo "${{ env.SHA512_WIN32 }} yt-dlp_x86.exe" >> SHA2-512SUMS
echo "${{ env.SHA512_MACOS }} yt-dlp_macos" >> SHA2-512SUMS
echo "${{ env.SHA512_MACOS_ZIP }} yt-dlp_macos.zip" >> SHA2-512SUMS
- name: Upload 512SUMS file
id: upload-512sums
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./SHA2-512SUMS
asset_name: SHA2-512SUMS
asset_content_type: text/plain
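The newer `meta_files` job generates SHA2-256SUMS / SHA2-512SUMS and, when a key is configured, detached GPG signatures over them. A minimal sketch of how a downloader could check a release binary against those files (assuming the files sit in the current directory and the signing key has already been imported; filenames are taken from the workflow):

```shell
# Verify the checksum entry for the downloaded binary
grep ' yt-dlp$' SHA2-256SUMS | sha256sum -c -

# Verify the detached signature over the checksum file itself
gpg --verify SHA2-256SUMS.sig SHA2-256SUMS
```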
25  .github/workflows/core.yml  (vendored)

@@ -1,8 +1,5 @@
name: Core Tests
on: [push, pull_request]
permissions:
contents: read

jobs:
tests:
name: Core Tests
@@ -11,28 +8,24 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest]
# CPython 3.11 is in quick-test
python-version: ['3.8', '3.9', '3.10', pypy-3.7, pypy-3.8]
os: [ubuntu-18.04]
# py3.9 is in quick-test
python-version: [3.7, 3.8, 3.10-dev, pypy-3.6, pypy-3.7]
run-tests-ext: [sh]
include:
# atleast one of each CPython/PyPy tests must be in windows
# atleast one of the tests must be in windows
- os: windows-latest
python-version: '3.7'
run-tests-ext: bat
- os: windows-latest
python-version: pypy-3.9
python-version: 3.6
run-tests-ext: bat
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install pytest
run: pip install pytest
- name: Run tests
continue-on-error: False
run: |
python3 -m yt_dlp -v || true # Print debug head
./devscripts/run_tests.${{ matrix.run-tests-ext }} core
run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} core
# Linter is in quick-test
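Both versions of the core-test workflow end up calling the same devscript, so the run is easy to reproduce locally. A minimal sketch, assuming a checkout of the repository and a Unix shell (Windows uses the `.bat` variant instead):

```shell
pip install pytest
python3 -m yt_dlp -v || true     # print the debug head, as the newer workflow does
./devscripts/run_tests.sh core   # core tests only; the linter runs in quick-test
```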
37  .github/workflows/download.yml  (vendored)

@@ -1,47 +1,24 @@
name: Download Tests
on: [push, pull_request]
permissions:
contents: read

jobs:
quick:
name: Quick Download Tests
tests:
name: Download Tests
if: "contains(github.event.head_commit.message, 'ci run dl')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: 3.9
- name: Install test requirements
run: pip install pytest
- name: Run tests
continue-on-error: true
run: ./devscripts/run_tests.sh download

full:
name: Full Download Tests
if: "contains(github.event.head_commit.message, 'ci run dl all')"
runs-on: ${{ matrix.os }}
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest]
python-version: ['3.7', '3.10', 3.11-dev, pypy-3.7, pypy-3.8]
os: [ubuntu-18.04]
python-version: [3.7, 3.8, 3.9, 3.10-dev, pypy-3.6, pypy-3.7]
run-tests-ext: [sh]
include:
# atleast one of each CPython/PyPy tests must be in windows
- os: windows-latest
python-version: '3.8'
run-tests-ext: bat
- os: windows-latest
python-version: pypy-3.9
python-version: 3.6
run-tests-ext: bat
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install pytest
81  .github/workflows/publish.yml  (vendored)

@@ -1,81 +0,0 @@
name: Publish
on:
workflow_call:
inputs:
nightly:
default: false
required: false
type: boolean
version:
required: true
type: string
target_commitish:
required: true
type: string
secrets:
ARCHIVE_REPO_TOKEN:
required: false

permissions:
contents: write

jobs:
publish:
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- uses: actions/download-artifact@v3
- uses: actions/setup-python@v4
with:
python-version: "3.10"

- name: Generate release notes
run: |
cat >> ./RELEASE_NOTES << EOF
#### A description of the various files are in the [README](https://github.com/yt-dlp/yt-dlp#release-files)
---
<details><summary><h3>Changelog</h3></summary>
$(python ./devscripts/make_changelog.py -vv)
</details>
EOF
echo "**This is an automated nightly pre-release build**" >> ./PRERELEASE_NOTES
cat ./RELEASE_NOTES >> ./PRERELEASE_NOTES
echo "Generated from: https://github.com/${{ github.repository }}/commit/${{ inputs.target_commitish }}" >> ./ARCHIVE_NOTES
cat ./RELEASE_NOTES >> ./ARCHIVE_NOTES

- name: Archive nightly release
env:
GH_TOKEN: ${{ secrets.ARCHIVE_REPO_TOKEN }}
GH_REPO: ${{ vars.ARCHIVE_REPO }}
if: |
inputs.nightly && env.GH_TOKEN != '' && env.GH_REPO != ''
run: |
gh release create \
--notes-file ARCHIVE_NOTES \
--title "yt-dlp nightly ${{ inputs.version }}" \
${{ inputs.version }} \
artifact/*

- name: Prune old nightly release
if: inputs.nightly && !vars.ARCHIVE_REPO
env:
GH_TOKEN: ${{ github.token }}
run: |
gh release delete --yes --cleanup-tag "nightly" || true
git tag --delete "nightly" || true
sleep 5 # Enough time to cover deletion race condition

- name: Publish release${{ inputs.nightly && ' (nightly)' || '' }}
env:
GH_TOKEN: ${{ github.token }}
if: (inputs.nightly && !vars.ARCHIVE_REPO) || !inputs.nightly
run: |
gh release create \
--notes-file ${{ inputs.nightly && 'PRE' || '' }}RELEASE_NOTES \
--target ${{ inputs.target_commitish }} \
--title "yt-dlp ${{ inputs.nightly && 'nightly ' || '' }}${{ inputs.version }}" \
${{ inputs.nightly && '--prerelease "nightly"' || inputs.version }} \
artifact/*
22  .github/workflows/quick-test.yml  (vendored)

@@ -1,32 +1,30 @@
name: Quick Test
on: [push, pull_request]
permissions:
contents: read

jobs:
tests:
name: Core Test
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.11
uses: actions/setup-python@v4
- uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.11'
python-version: 3.9
- name: Install test requirements
run: pip install pytest pycryptodomex
- name: Run tests
run: |
python3 -m yt_dlp -v || true
./devscripts/run_tests.sh core
run: ./devscripts/run_tests.sh core
flake8:
name: Linter
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Install flake8
run: pip install flake8
- name: Make lazy extractors
51  .github/workflows/release-nightly.yml  (vendored)

@@ -1,51 +0,0 @@
name: Release (nightly)
on:
push:
branches:
- master
paths:
- "yt_dlp/**.py"
- "!yt_dlp/version.py"
concurrency:
group: release-nightly
cancel-in-progress: true
permissions:
contents: read

jobs:
prepare:
if: vars.BUILD_NIGHTLY != ''
runs-on: ubuntu-latest
outputs:
version: ${{ steps.get_version.outputs.version }}

steps:
- uses: actions/checkout@v3
- name: Get version
id: get_version
run: |
python devscripts/update-version.py "$(date -u +"%H%M%S")" | grep -Po "version=\d+(\.\d+){3}" >> "$GITHUB_OUTPUT"

build:
needs: prepare
uses: ./.github/workflows/build.yml
with:
version: ${{ needs.prepare.outputs.version }}
channel: nightly
permissions:
contents: read
packages: write # For package cache
secrets:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}

publish:
needs: [prepare, build]
uses: ./.github/workflows/publish.yml
secrets:
ARCHIVE_REPO_TOKEN: ${{ secrets.ARCHIVE_REPO_TOKEN }}
permissions:
contents: write
with:
nightly: true
version: ${{ needs.prepare.outputs.version }}
target_commitish: ${{ github.sha }}
129  .github/workflows/release.yml  (vendored)

@@ -1,129 +0,0 @@
name: Release
on: workflow_dispatch
permissions:
contents: read

jobs:
prepare:
permissions:
contents: write
runs-on: ubuntu-latest
outputs:
version: ${{ steps.update_version.outputs.version }}
head_sha: ${{ steps.push_release.outputs.head_sha }}

steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0

- uses: actions/setup-python@v4
with:
python-version: "3.10"

- name: Update version
id: update_version
run: |
python devscripts/update-version.py ${{ vars.PUSH_VERSION_COMMIT == '' && '"$(date -u +"%H%M%S")"' || '' }} | \
grep -Po "version=\d+\.\d+\.\d+(\.\d+)?" >> "$GITHUB_OUTPUT"

- name: Update documentation
run: |
make doc
sed '/### /Q' Changelog.md >> ./CHANGELOG
echo '### ${{ steps.update_version.outputs.version }}' >> ./CHANGELOG
python ./devscripts/make_changelog.py -vv -c >> ./CHANGELOG
echo >> ./CHANGELOG
grep -Poz '(?s)### \d+\.\d+\.\d+.+' 'Changelog.md' | head -n -1 >> ./CHANGELOG
cat ./CHANGELOG > Changelog.md

- name: Push to release
id: push_release
run: |
git config --global user.name github-actions
git config --global user.email github-actions@example.com
git add -u
git commit -m "Release ${{ steps.update_version.outputs.version }}" \
-m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all :ci run dl"
git push origin --force ${{ github.event.ref }}:release
echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"

- name: Update master
if: vars.PUSH_VERSION_COMMIT != ''
run: git push origin ${{ github.event.ref }}

publish_pypi_homebrew:
needs: prepare
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.10"

- name: Install Requirements
run: |
sudo apt-get -y install pandoc man
python -m pip install -U pip setuptools wheel twine
python -m pip install -U -r requirements.txt

- name: Prepare
run: |
python devscripts/update-version.py ${{ needs.prepare.outputs.version }}
python devscripts/make_lazy_extractors.py

- name: Build and publish on PyPI
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
if: env.TWINE_PASSWORD != ''
run: |
rm -rf dist/*
make pypi-files
python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
python setup.py sdist bdist_wheel
twine upload dist/*

- name: Checkout Homebrew repository
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
if: env.BREW_TOKEN != '' && env.PYPI_TOKEN != ''
uses: actions/checkout@v3
with:
repository: yt-dlp/homebrew-taps
path: taps
ssh-key: ${{ secrets.BREW_TOKEN }}

- name: Update Homebrew Formulae
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
if: env.BREW_TOKEN != '' && env.PYPI_TOKEN != ''
run: |
python devscripts/update-formulae.py taps/Formula/yt-dlp.rb "${{ needs.prepare.outputs.version }}"
git -C taps/ config user.name github-actions
git -C taps/ config user.email github-actions@example.com
git -C taps/ commit -am 'yt-dlp: ${{ needs.prepare.outputs.version }}'
git -C taps/ push

build:
needs: prepare
uses: ./.github/workflows/build.yml
with:
version: ${{ needs.prepare.outputs.version }}
permissions:
contents: read
packages: write # For package cache
secrets:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}

publish:
needs: [prepare, build]
uses: ./.github/workflows/publish.yml
permissions:
contents: write
with:
version: ${{ needs.prepare.outputs.version }}
target_commitish: ${{ needs.prepare.outputs.head_sha }}
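One detail worth noting across these workflow diffs: the older jobs publish step outputs via the deprecated `::set-output` workflow command, while the newer ones append `key=value` pairs to the `$GITHUB_OUTPUT` file. A minimal sketch of the two idioms side by side, using the `head_sha` output that appears in both versions:

```shell
# Older, deprecated form (seen in the 2022 workflows)
echo "::set-output name=head_sha::$(git rev-parse HEAD)"

# Newer form (seen in the 2023 workflows)
echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
```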
17  .gitignore  (vendored)

@@ -24,13 +24,10 @@ cookies

*.3gp
*.ape
*.ass
*.avi
*.desktop
*.f4v
*.flac
*.flv
*.gif
*.jpeg
*.jpg
*.m4a
@@ -40,8 +37,6 @@ cookies
*.mov
*.mp3
*.mp4
*.mpga
*.oga
*.ogg
*.opus
*.png
@@ -49,7 +44,6 @@ cookies
*.srt
*.swf
*.swp
*.tt
*.ttml
*.url
*.vtt
@@ -72,7 +66,6 @@ dist/
zip/
tmp/
venv/
.venv/
completions/

# Misc
@@ -88,8 +81,6 @@ updates_key.pem
*.egg-info
.tox
*.class
*.isorted
*.stackdump

# Generated
AUTHORS
@@ -101,7 +92,7 @@ README.txt
*.tar.gz
*.zsh
*.spec
test/testdata/sigs/player-*.js
test/testdata/player-*.js

# Binary
/youtube-dl
@@ -115,11 +106,11 @@ yt-dlp.zip
*.iml
.vscode
*.sublime-*
*.code-workspace

# Lazy extractors
*/extractor/lazy_extractors.py

# Plugins
ytdlp_plugins/
yt-dlp-plugins
ytdlp_plugins/extractor/*
!ytdlp_plugins/extractor/__init__.py
!ytdlp_plugins/extractor/sample.py
22  .readthedocs.yml  (Normal file)

@@ -0,0 +1,22 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/conf.py

# Optionally build your docs in additional formats such as PDF
formats:
- epub
- pdf
- htmlzip

# Optionally set the version of Python and requirements required to build your docs
python:
version: 3
install:
- requirements: docs/requirements.txt
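The Read the Docs configuration added above points at `docs/conf.py` and `docs/requirements.txt`. A rough local equivalent of what that service runs (the output directory name here is arbitrary, not part of the config):

```shell
pip install -r docs/requirements.txt        # Sphinx and its dependencies
sphinx-build -b html docs docs/_build/html  # build the HTML docs from docs/conf.py
```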
173
CONTRIBUTING.md
173
CONTRIBUTING.md
@@ -11,7 +11,6 @@ # CONTRIBUTING TO YT-DLP
|
||||
- [Is anyone going to need the feature?](#is-anyone-going-to-need-the-feature)
|
||||
- [Is your question about yt-dlp?](#is-your-question-about-yt-dlp)
|
||||
- [Are you willing to share account details if needed?](#are-you-willing-to-share-account-details-if-needed)
|
||||
- [Is the website primarily used for piracy](#is-the-website-primarily-used-for-piracy)
|
||||
- [DEVELOPER INSTRUCTIONS](#developer-instructions)
|
||||
- [Adding new feature or making overarching changes](#adding-new-feature-or-making-overarching-changes)
|
||||
- [Adding support for a new site](#adding-support-for-a-new-site)
|
||||
@@ -25,7 +24,6 @@ # CONTRIBUTING TO YT-DLP
|
||||
- [Collapse fallbacks](#collapse-fallbacks)
|
||||
- [Trailing parentheses](#trailing-parentheses)
|
||||
- [Use convenience conversion and parsing functions](#use-convenience-conversion-and-parsing-functions)
|
||||
- [My pull request is labeled pending-fixes](#my-pull-request-is-labeled-pending-fixes)
|
||||
- [EMBEDDING YT-DLP](README.md#embedding-yt-dlp)
|
||||
|
||||
|
||||
@@ -115,7 +113,7 @@ ### Is your question about yt-dlp?
|
||||
|
||||
### Are you willing to share account details if needed?
|
||||
|
||||
The maintainers and potential contributors of the project often do not have an account for the website you are asking support for. So any developer interested in solving your issue may ask you for account details. It is your personal discretion whether you are willing to share the account in order for the developer to try and solve your issue. However, if you are unwilling or unable to provide details, they obviously cannot work on the issue and it cannot be solved unless some developer who both has an account and is willing/able to contribute decides to solve it.
|
||||
The maintainers and potential contributors of the project often do not have an account for the website you are asking support for. So any developer interested in solving your issue may ask you for account details. It is your personal discression whether you are willing to share the account in order for the developer to try and solve your issue. However, if you are unwilling or unable to provide details, they obviously cannot work on the issue and it cannot be solved unless some developer who both has an account and is willing/able to contribute decides to solve it.
|
||||
|
||||
By sharing an account with anyone, you agree to bear all risks associated with it. The maintainers and yt-dlp can't be held responsible for any misuse of the credentials.
|
||||
|
||||
@@ -125,10 +123,6 @@ ### Are you willing to share account details if needed?
|
||||
- Change the password before sharing the account to something random (use [this](https://passwordsgenerator.net/) if you don't have a random password generator).
|
||||
- Change the password after receiving the account back.
|
||||
|
||||
### Is the website primarily used for piracy?
|
||||
|
||||
We follow [youtube-dl's policy](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) to not support services that is primarily used for infringing copyright. Additionally, it has been decided to not to support porn sites that specialize in fakes. We also cannot support any service that serves only [DRM protected content](https://en.wikipedia.org/wiki/Digital_rights_management).
|
||||
|
||||
|
||||
|
||||
|
||||
@@ -161,7 +155,7 @@ ## Adding new feature or making overarching changes
|
||||
|
||||
## Adding support for a new site
|
||||
|
||||
If you want to add support for a new site, first of all **make sure** this site is **not dedicated to [copyright infringement](#is-the-website-primarily-used-for-piracy)**. yt-dlp does **not support** such sites thus pull requests adding support for them **will be rejected**.
|
||||
If you want to add support for a new site, first of all **make sure** this site is **not dedicated to [copyright infringement](https://www.github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free)**. yt-dlp does **not support** such sites thus pull requests adding support for them **will be rejected**.
|
||||
|
||||
After you have ensured this site is distributing its content legally, you can follow this quick list (assuming your service is called `yourextractor`):
|
||||
|
||||
@@ -178,6 +172,7 @@ ## Adding support for a new site
|
||||
1. Start with this simple template and save it to `yt_dlp/extractor/yourextractor.py`:
|
||||
|
||||
```python
|
||||
# coding: utf-8
|
||||
from .common import InfoExtractor
|
||||
|
||||
|
||||
@@ -195,7 +190,7 @@ ## Adding support for a new site
|
||||
# * A value
|
||||
# * MD5 checksum; start the string with md5:
|
||||
# * A regular expression; start the string with re:
|
||||
# * Any Python type, e.g. int or float
|
||||
# * Any Python type (for example int or float)
|
||||
}
|
||||
}]
|
||||
|
||||
@@ -214,18 +209,18 @@ ## Adding support for a new site
|
||||
# TODO more properties (see yt_dlp/extractor/common.py)
|
||||
}
|
||||
```
|
||||
1. Add an import in [`yt_dlp/extractor/_extractors.py`](yt_dlp/extractor/_extractors.py). Note that the class name must end with `IE`.
|
||||
1. Run `python test/test_download.py TestDownload.test_YourExtractor` (note that `YourExtractor` doesn't end with `IE`). This *should fail* at first, but you can continually re-run it until you're done. If you decide to add more than one test, the tests will then be named `TestDownload.test_YourExtractor`, `TestDownload.test_YourExtractor_1`, `TestDownload.test_YourExtractor_2`, etc. Note that tests with `only_matching` key in test's dict are not counted in. You can also run all the tests in one go with `TestDownload.test_YourExtractor_all`
|
||||
1. Add an import in [`yt_dlp/extractor/extractors.py`](yt_dlp/extractor/extractors.py).
|
||||
1. Run `python test/test_download.py TestDownload.test_YourExtractor`. This *should fail* at first, but you can continually re-run it until you're done. If you decide to add more than one test, the tests will then be named `TestDownload.test_YourExtractor`, `TestDownload.test_YourExtractor_1`, `TestDownload.test_YourExtractor_2`, etc. Note that tests with `only_matching` key in test's dict are not counted in. You can also run all the tests in one go with `TestDownload.test_YourExtractor_all`
|
||||
1. Make sure you have at least one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running (see the sketch after this list).
|
||||
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L91-L426). Add tests and code for as many as you want.
|
||||
1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions) and check the code with [flake8](https://flake8.pycqa.org/en/latest/index.html#quickstart):
|
||||
|
||||
$ flake8 yt_dlp/extractor/yourextractor.py
|
||||
|
||||
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.7 and above. Backward compatibility is not required for even older versions of Python.
|
||||
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.6 and above. Backward compatibility is not required for even older versions of Python.
|
||||
1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
|
||||
|
||||
$ git add yt_dlp/extractor/_extractors.py
|
||||
$ git add yt_dlp/extractor/extractors.py
|
||||
$ git add yt_dlp/extractor/yourextractor.py
|
||||
$ git commit -m '[yourextractor] Add extractor'
|
||||
$ git push origin yourextractor
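For reference, here is a rough sketch of a test entry that uses `skip`, together with an `only_matching` entry (the URL, `id` and other field values are made up purely for illustration):

```python
    _TESTS = [{
        # hypothetical URL, used only for illustration
        'url': 'https://yourextractor.com/watch/42',
        'info_dict': {
            'id': '42',
            'ext': 'mp4',
            'title': 'Video title goes here',
        },
        # explain why the test cannot be run automatically
        'skip': 'Only available to logged-in users',
    }, {
        # URL-matching-only test; not run as a download test
        'url': 'https://yourextractor.com/embed/42',
        'only_matching': True,
    }]
```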
@@ -257,11 +252,7 @@ ### Mandatory and optional metafields
|
||||
- `title` (media title)
|
||||
- `url` (media download URL) or `formats`
|
||||
|
||||
The aforementioned metafields are the critical data without which the extraction does not make any sense; if any of them fail to be extracted, then the extractor is considered completely broken. While all extractors must return a `title`, they must also allow its extraction to be non-fatal.
|
||||
|
||||
For pornographic sites, appropriate `age_limit` must also be returned.
|
||||
|
||||
The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract useful information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
|
||||
The aforementioned metafields are the critical data without which the extraction does not make any sense; if any of them fail to be extracted, then the extractor is considered completely broken. While, in fact, only `id` is technically mandatory, due to compatibility reasons, yt-dlp also treats `title` as mandatory. The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract useful information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
|
||||
|
||||
[Any field](yt_dlp/extractor/common.py#L219-L426) apart from the aforementioned ones is considered **optional**. That means that extraction should be **tolerant** to situations when sources for these fields can potentially be unavailable (even if they are always available at the moment) and **future-proof** in order not to break the extraction of general purpose mandatory fields.
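As a rough, non-authoritative sketch of how these rules fit together (the manifest URL pattern and page layout are assumptions; the helper methods are the ones provided by `yt_dlp/extractor/common.py`):

```python
    def _real_extract(self, url):
        video_id = self._match_id(url)
        webpage = self._download_webpage(url, video_id)

        return {
            'id': video_id,  # the only technically mandatory field
            # `title` must be returned, but its extraction should be non-fatal
            'title': self._og_search_title(webpage, default=None),
            # assumed manifest location; failure here is tolerated so that
            # --ignore-no-formats-error can still surface the metadata
            'formats': self._extract_m3u8_formats(
                f'https://yourextractor.com/{video_id}/master.m3u8',
                video_id, 'mp4', fatal=False),
            'age_limit': 18,  # only where appropriate, e.g. adult sites
        }
```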
@@ -300,10 +291,14 @@ #### Example
|
||||
The latter will break the extraction process with `KeyError` if `summary` disappears from `meta` at some later time, but with the former approach extraction will just go ahead with `description` set to `None`, which is perfectly fine (remember `None` is equivalent to the absence of data).
|
||||
|
||||
|
||||
If the data is nested, do not use `.get` chains, but instead make use of `traverse_obj`.
|
||||
If the data is nested, do not use `.get` chains, but instead make use of the utility functions `try_get` or `traverse_obj`
|
||||
|
||||
Considering the above `meta` again, assume you want to extract `["user"]["name"]` and put it in the resulting info dict as `uploader`
|
||||
|
||||
```python
|
||||
uploader = try_get(meta, lambda x: x['user']['name']) # correct
|
||||
```
|
||||
or
|
||||
```python
|
||||
uploader = traverse_obj(meta, ('user', 'name')) # correct
|
||||
```
|
||||
@@ -317,10 +312,6 @@ #### Example
|
||||
```python
|
||||
uploader = meta.get('user', {}).get('name') # incorrect
|
||||
```
|
||||
or
|
||||
```python
|
||||
uploader = try_get(meta, lambda x: x['user']['name']) # old utility
|
||||
```
|
||||
|
||||
|
||||
Similarly, you should pass `fatal=False` when extracting optional data from a webpage with `_search_regex`, `_html_search_regex` or similar methods, for instance:
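A minimal sketch (the regex and the metadata field are invented for illustration); if the pattern is not found, yt-dlp only emits a warning instead of aborting the extraction:

```python
description = self._search_regex(
    r'<meta[^>]+name="description"[^>]+content="([^"]+)"',
    webpage, 'description', fatal=False)
```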
@@ -346,42 +337,26 @@ #### Example
|
||||
|
||||
Another thing to remember is not to try to iterate over `None`
|
||||
|
||||
Say you extracted a list of thumbnails into `thumbnail_data` and want to iterate over them
|
||||
Say you extracted a list of thumbnails into `thumbnail_data` using `try_get` and now want to iterate over them
|
||||
|
||||
```python
|
||||
thumbnail_data = data.get('thumbnails') or []
|
||||
thumbnail_data = try_get(...)
|
||||
thumbnails = [{
|
||||
'url': item['url'],
|
||||
'height': item.get('h'),
|
||||
} for item in thumbnail_data if item.get('url')] # correct
|
||||
'url': item['url']
|
||||
} for item in thumbnail_data or []] # correct
|
||||
```
|
||||
|
||||
and not like:
|
||||
|
||||
```python
|
||||
thumbnail_data = data.get('thumbnails')
|
||||
thumbnail_data = try_get(...)
|
||||
thumbnails = [{
|
||||
'url': item['url'],
|
||||
'height': item.get('h'),
|
||||
'url': item['url']
|
||||
} for item in thumbnail_data] # incorrect
|
||||
```
|
||||
|
||||
In this case, `thumbnail_data` will be `None` if the field was not found and this will cause the loop `for item in thumbnail_data` to raise a fatal error. Using `or []` avoids this error and results in setting an empty list in `thumbnails` instead.
|
||||
In the latter case, `thumbnail_data` will be `None` if the field was not found and this will cause the loop `for item in thumbnail_data` to raise a fatal error. Using `for item in thumbnail_data or []` avoids this error and results in setting an empty list in `thumbnails` instead.
|
||||
|
||||
Alternatively, this can be further simplified by using `traverse_obj`
|
||||
|
||||
```python
|
||||
thumbnails = [{
|
||||
'url': item['url'],
|
||||
'height': item.get('h'),
|
||||
} for item in traverse_obj(data, ('thumbnails', lambda _, v: v['url']))]
|
||||
```
|
||||
|
||||
or, even better,
|
||||
|
||||
```python
|
||||
thumbnails = traverse_obj(data, ('thumbnails', ..., {'url': 'url', 'height': 'h'}))
|
||||
```
|
||||
|
||||
### Provide fallbacks
|
||||
|
||||
@@ -390,21 +365,21 @@ ### Provide fallbacks
|
||||
|
||||
#### Example
|
||||
|
||||
Say `meta` from the previous example has a `title` and you are about to extract it like:
|
||||
Say `meta` from the previous example has a `title` and you are about to extract it. Since `title` is a mandatory meta field you should end up with something like:
|
||||
|
||||
```python
|
||||
title = meta.get('title')
|
||||
title = meta['title']
|
||||
```
|
||||
|
||||
If `title` disappears from `meta` in the future due to some changes on the hoster's side, the title extraction would fail.
|
||||
If `title` disappears from `meta` in the future due to some changes on the hoster's side, the extraction would fail since `title` is mandatory. That's expected.
|
||||
|
||||
Assume that you have another source you can extract `title` from, for example the `og:title` HTML meta of a `webpage`. In this case you can provide a fallback like:
|
||||
Assume that you have another source you can extract `title` from, for example the `og:title` HTML meta of a `webpage`. In this case you can provide a fallback scenario:
|
||||
|
||||
```python
|
||||
title = meta.get('title') or self._og_search_title(webpage)
|
||||
```
|
||||
|
||||
This code will try to extract from `meta` first and if it fails it will try extracting `og:title` from a `webpage`, making the extractor more robust.
|
||||
This code will try to extract from `meta` first and if it fails it will try extracting `og:title` from a `webpage`.
|
||||
|
||||
|
||||
### Regular expressions
|
||||
@@ -447,7 +422,7 @@ ##### Example
|
||||
r'<span[^>]+class="title"[^>]*>([^<]+)', webpage, 'title')
|
||||
```
|
||||
|
||||
which tolerates potential changes in the `style` attribute's value. Or even better:
|
||||
Or even better:
|
||||
|
||||
```python
|
||||
title = self._search_regex( # correct
|
||||
@@ -455,7 +430,7 @@ ##### Example
|
||||
webpage, 'title', group='title')
|
||||
```
|
||||
|
||||
which also handles single quotes in addition to double quotes.
|
||||
Note how you tolerate potential changes in the `style` attribute's value or a switch from double quotes to single quotes for the `class` attribute:
|
||||
|
||||
The code definitely should not look like:
|
||||
|
||||
@@ -473,42 +448,7 @@ ##### Example
|
||||
webpage, 'title', group='title')
|
||||
```
|
||||
|
||||
Here the presence or absence of other attributes including `style` is irrelevant for the data we need, and so the regex must not depend on it
|
||||
|
||||
|
||||
#### Keep the regular expressions as simple as possible, but no simpler
|
||||
|
||||
Since many extractors deal with unstructured data provided by websites, we will often need to use very complex regular expressions. You should try to use the *simplest* regex that can accomplish what you want. In other words, each part of the regex must have a reason for existing. If you can take out a symbol and the functionality does not change, the symbol should not be there.
|
||||
|
||||
##### Example
|
||||
|
||||
Correct:
|
||||
|
||||
```python
|
||||
_VALID_URL = r'https?://(?:www\.)?website\.com/(?:[^/]+/){3,4}(?P<display_id>[^/]+)_(?P<id>\d+)'
|
||||
```
|
||||
|
||||
Incorrect:
|
||||
|
||||
```python
|
||||
_VALID_URL = r'https?:\/\/(?:www\.)?website\.com\/[^\/]+/[^\/]+/[^\/]+(?:\/[^\/]+)?\/(?P<display_id>[^\/]+)_(?P<id>\d+)'
|
||||
```
|
||||
|
||||
#### Do not misuse `.` and use the correct quantifiers (`+*?`)
|
||||
|
||||
Avoid creating regexes that over-match because of wrong use of quantifiers. Also try to avoid non-greedy matching (`?`) where possible since it could easily result in [catastrophic backtracking](https://www.regular-expressions.info/catastrophic.html)
|
||||
|
||||
Correct:
|
||||
|
||||
```python
|
||||
title = self._search_regex(r'<span\b[^>]+class="title"[^>]*>([^<]+)', webpage, 'title')
|
||||
```
|
||||
|
||||
Incorrect:
|
||||
|
||||
```python
|
||||
title = self._search_regex(r'<span\b.*class="title".*>(.+?)<', webpage, 'title')
|
||||
```
|
||||
Here the presence or absence of other attributes including `style` is irrelevant for the data we need, and so the regex must not depend on it
|
||||
|
||||
|
||||
### Long lines policy
|
||||
@@ -517,7 +457,7 @@ ### Long lines policy
|
||||
|
||||
For example, you should **never** split long string literals like URLs or some other often copied entities over multiple lines to fit this limit:
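For instance (the URL below is just a made-up placeholder), keep the whole literal on one line even if it exceeds the limit:

```python
# Correct: the long string literal stays on a single line
MANIFEST_URL = 'https://www.example.com/api/v2/videos/12345/manifest.m3u8?policy=some_very_long_signed_policy_token&signature=abcdef'

# Incorrect: do not break string literals just to satisfy the line-length limit
MANIFEST_URL = ('https://www.example.com/api/v2/videos/12345/manifest.m3u8'
                '?policy=some_very_long_signed_policy_token&signature=abcdef')
```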
Conversely, don't unnecessarily split small lines further. As a rule of thumb, if removing the line split keeps the code under 80 characters, it should be a single line.
|
||||
Conversely, don't unnecessarily split small lines further. As a rule of thumb, if removing the line split keeps the code under 80 characters, it should be a single line.
|
||||
|
||||
##### Examples
|
||||
|
||||
@@ -572,35 +512,27 @@ ##### Examples
|
||||
|
||||
### Quotes
|
||||
|
||||
Always use single quotes for strings (even if the string has `'`) and double quotes for docstrings. Use `'''` only for multi-line strings. An exception can be made if a string has multiple single quotes in it and escaping makes it *significantly* harder to read. For f-strings, you can use double quotes on the inside. But avoid f-strings that have too many quotes inside.
|
||||
Always use single quotes for strings (even if the string has `'`) and double quotes for docstrings. Use `'''` only for multi-line strings. An exception can be made if a string has multiple single quotes in it and escaping makes it significantly harder to read. For f-strings, you can use double quotes on the inside. But avoid f-strings that have too many quotes inside.
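A small sketch of these rules in practice (the strings themselves are arbitrary):

```python
name = 'single quotes for ordinary strings'
quote = "an exception: it's easier to read Burns' lines without escaping"
label = f'f-strings may use {"double quotes"} on the inside'


def describe():
    """Docstrings use double quotes."""
    return '''
        multi-line strings
        use triple single quotes
    '''
```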
### Inline values
|
||||
|
||||
Extracting variables is acceptable for reducing code duplication and improving readability of complex expressions. However, you should avoid extracting variables used only once and moving them to opposite parts of the extractor file, which makes reading the linear flow difficult.
|
||||
|
||||
#### Examples
|
||||
#### Example
|
||||
|
||||
Correct:
|
||||
|
||||
```python
|
||||
return {
|
||||
'title': self._html_search_regex(r'<h1>([^<]+)</h1>', webpage, 'title'),
|
||||
# ...some lines of code...
|
||||
}
|
||||
title = self._html_search_regex(r'<title>([^<]+)</title>', webpage, 'title')
|
||||
```
|
||||
|
||||
Incorrect:
|
||||
|
||||
```python
|
||||
TITLE_RE = r'<h1>([^<]+)</h1>'
|
||||
TITLE_RE = r'<title>([^<]+)</title>'
|
||||
# ...some lines of code...
|
||||
title = self._html_search_regex(TITLE_RE, webpage, 'title')
|
||||
# ...some lines of code...
|
||||
return {
|
||||
'title': title,
|
||||
# ...some lines of code...
|
||||
}
|
||||
```
|
||||
|
||||
|
||||
@@ -632,32 +564,33 @@ #### Example
|
||||
|
||||
### Trailing parentheses
|
||||
|
||||
Always move trailing parentheses used for grouping/functions after the last argument. On the other hand, a multi-line literal list/tuple/dict/set should be closed on a new line. Generators and list/dict comprehensions may use either style.
|
||||
Always move trailing parentheses used for grouping/functions after the last argument. On the other hand, a literal list/tuple/dict/set should be closed on a new line. Generators and list/dict comprehensions may use either style.
|
||||
|
||||
#### Examples
|
||||
|
||||
Correct:
|
||||
|
||||
```python
|
||||
url = traverse_obj(info, (
|
||||
'context', 'dispatcher', 'stores', 'VideoTitlePageStore', 'data', 'video', 0, 'VideoUrlSet', 'VideoUrl'), list)
|
||||
url = try_get(
|
||||
info,
|
||||
lambda x: x['ResultSet']['Result'][0]['VideoUrlSet']['VideoUrl'],
|
||||
list)
|
||||
```
|
||||
Correct:
|
||||
|
||||
```python
|
||||
url = traverse_obj(
|
||||
info,
|
||||
('context', 'dispatcher', 'stores', 'VideoTitlePageStore', 'data', 'video', 0, 'VideoUrlSet', 'VideoUrl'),
|
||||
list)
|
||||
url = try_get(info,
|
||||
lambda x: x['ResultSet']['Result'][0]['VideoUrlSet']['VideoUrl'],
|
||||
list)
|
||||
```
|
||||
|
||||
Incorrect:
|
||||
|
||||
```python
|
||||
url = traverse_obj(
|
||||
url = try_get(
|
||||
info,
|
||||
('context', 'dispatcher', 'stores', 'VideoTitlePageStore', 'data', 'video', 0, 'VideoUrlSet', 'VideoUrl'),
|
||||
list
|
||||
lambda x: x['ResultSet']['Result'][0]['VideoUrlSet']['VideoUrl'],
|
||||
list,
|
||||
)
|
||||
```
|
||||
|
||||
@@ -700,28 +633,28 @@ ### Use convenience conversion and parsing functions
|
||||
|
||||
Use `url_or_none` for safe URL processing.
|
||||
|
||||
Use `traverse_obj` and `try_call` (which supersede `dict_get` and `try_get`) for safe metadata extraction from parsed JSON.
|
||||
Use `try_get`, `dict_get` and `traverse_obj` for safe metadata extraction from parsed JSON.
|
||||
|
||||
Use `unified_strdate` for uniform `upload_date` or any `YYYYMMDD` meta field extraction, `unified_timestamp` for uniform `timestamp` extraction, `parse_filesize` for `filesize` extraction, `parse_count` for count meta fields extraction, `parse_resolution` for extracting `width` and `height` from resolution strings, `parse_duration` for `duration` extraction, and `parse_age_limit` for `age_limit` extraction.
|
||||
|
||||
Explore [`yt_dlp/utils.py`](yt_dlp/utils.py) for more useful convenience functions.
|
||||
|
||||
#### Examples
|
||||
#### More examples
|
||||
|
||||
##### Safely extract optional description from parsed JSON
|
||||
```python
|
||||
description = traverse_obj(response, ('result', 'video', 'summary'), expected_type=str)
|
||||
thumbnails = traverse_obj(response, ('result', 'thumbnails', ..., 'url'), expected_type=url_or_none)
|
||||
```
|
||||
|
||||
##### Safely extract more optional metadata
|
||||
```python
|
||||
video = traverse_obj(response, ('result', 'video', 0), default={}, expected_type=dict)
|
||||
description = video.get('summary')
|
||||
duration = float_or_none(video.get('durationMs'), scale=1000)
|
||||
view_count = int_or_none(video.get('views'))
|
||||
```
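##### Safely parse dates, durations and counts

A further sketch along the same lines (the `video` dict and its keys are assumptions carried over from the example above):

```python
upload_date = unified_strdate(video.get('publishedAt'))
timestamp = unified_timestamp(video.get('publishedAt'))
duration = parse_duration(video.get('duration'))
view_count = parse_count(video.get('viewCountText'))
age_limit = parse_age_limit(video.get('rating'))
```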
# My pull request is labeled pending-fixes
|
||||
|
||||
The `pending-fixes` label is added when changes are requested on a PR. When the necessary changes are made, the label should be removed. However, despite our best efforts, it may sometimes happen that the maintainer did not see the changes or forgot to remove the label. If your PR is still marked as `pending-fixes` a few days after all requested changes have been made, feel free to ping the maintainer who labeled it and ask them to re-review and remove the label.
|
||||
|
||||
|
||||
|
||||
|
||||
# EMBEDDING YT-DLP
|
||||
|
||||
223
CONTRIBUTORS
@@ -3,8 +3,6 @@ shirt-dev (collaborator)
|
||||
coletdjnz/colethedj (collaborator)
|
||||
Ashish0804 (collaborator)
|
||||
nao20010128nao/Lesmiscore (collaborator)
|
||||
bashonly (collaborator)
|
||||
Grub4K (collaborator)
|
||||
h-h-h-h
|
||||
pauldubois98
|
||||
nixxo
|
||||
@@ -148,7 +146,7 @@ chio0hai
|
||||
cntrl-s
|
||||
Deer-Spangle
|
||||
DEvmIb
|
||||
Grabien/MaximVol
|
||||
Grabien
|
||||
j54vc1bk
|
||||
mpeter50
|
||||
mrpapersonic
|
||||
@@ -162,7 +160,7 @@ PilzAdam
|
||||
zmousm
|
||||
iw0nderhow
|
||||
unit193
|
||||
TwoThousandHedgehogs/KathrynElrod
|
||||
TwoThousandHedgehogs
|
||||
Jertzukka
|
||||
cypheron
|
||||
Hyeeji
|
||||
@@ -192,220 +190,3 @@ CreaValix
|
||||
sian1468
|
||||
arkamar
|
||||
hyano
|
||||
KiberInfinity
|
||||
tejing1
|
||||
Bricio
|
||||
lazypete365
|
||||
Aniruddh-J
|
||||
blackgear
|
||||
CplPwnies
|
||||
cyberfox1691
|
||||
FestplattenSchnitzel
|
||||
hatienl0i261299
|
||||
iphoting
|
||||
jakeogh
|
||||
lukasfink1
|
||||
lyz-code
|
||||
marieell
|
||||
mdpauley
|
||||
Mipsters
|
||||
mxmehl
|
||||
ofkz
|
||||
P-reducible
|
||||
pycabbage
|
||||
regarten
|
||||
Ronnnny
|
||||
schn0sch
|
||||
s0u1h
|
||||
MrRawes
|
||||
cffswb
|
||||
danielyli
|
||||
1-Byte
|
||||
mehq
|
||||
dzek69
|
||||
aaearon
|
||||
panatexxa
|
||||
kmark
|
||||
un-def
|
||||
goggle
|
||||
Soebb
|
||||
Fam0r
|
||||
bohwaz
|
||||
dodrian
|
||||
vvto33
|
||||
ca-za
|
||||
connercsbn
|
||||
diegorodriguezv
|
||||
ekangmonyet
|
||||
elyse0
|
||||
evansp
|
||||
GiedriusS
|
||||
HE7086
|
||||
JordanWeatherby
|
||||
m4tu4g
|
||||
MarwenDallel
|
||||
nevack
|
||||
putnam
|
||||
rand-net
|
||||
vertan
|
||||
Wikidepia
|
||||
Yipten
|
||||
moench-tegeder
|
||||
christoph-heinrich
|
||||
HobbyistDev
|
||||
LunarFang416
|
||||
sbor23
|
||||
aurelg
|
||||
adamanldo
|
||||
gamer191
|
||||
vkorablin
|
||||
Burve
|
||||
mnn
|
||||
ZhymabekRoman
|
||||
mozbugbox
|
||||
aejdl
|
||||
ping
|
||||
sqrtNOT
|
||||
bubbleguuum
|
||||
darkxex
|
||||
miseran
|
||||
StefanLobbenmeier
|
||||
crazymoose77756
|
||||
nomevi
|
||||
Brett824
|
||||
pingiun
|
||||
dosy4ev
|
||||
EhtishamSabir
|
||||
Ferdi265
|
||||
FirefoxMetzger
|
||||
ftk
|
||||
lamby
|
||||
llamasblade
|
||||
lockmatrix
|
||||
misaelaguayo
|
||||
odo2063
|
||||
pritam20ps05
|
||||
scy
|
||||
sheerluck
|
||||
AxiosDeminence
|
||||
DjesonPV
|
||||
eren-kemer
|
||||
freezboltz
|
||||
Galiley
|
||||
haobinliang
|
||||
Mehavoid
|
||||
winterbird-code
|
||||
yashkc2025
|
||||
aldoridhoni
|
||||
jacobtruman
|
||||
masta79
|
||||
palewire
|
||||
cgrigis
|
||||
DavidH-2022
|
||||
dfaker
|
||||
jackyyf
|
||||
ohaiibuzzle
|
||||
SamantazFox
|
||||
shreyasminocha
|
||||
tejasa97
|
||||
xenov
|
||||
satan1st
|
||||
0xGodspeed
|
||||
5736d79
|
||||
587021c
|
||||
basrieter
|
||||
Bobscorn
|
||||
CNugteren
|
||||
columndeeply
|
||||
DoubleCouponDay
|
||||
Fabi019
|
||||
GautamMKGarg
|
||||
itachi-19
|
||||
jeroenj
|
||||
josanabr
|
||||
LiviaMedeiros
|
||||
nikita-moor
|
||||
snapdgn
|
||||
SuperSonicHub1
|
||||
tannertechnology
|
||||
Timendum
|
||||
tobi1805
|
||||
TokyoBlackHole
|
||||
ajayyy
|
||||
Alienmaster
|
||||
bsun0000
|
||||
changren-wcr
|
||||
ClosedPort22
|
||||
CrankDatSouljaBoy
|
||||
cruel-efficiency
|
||||
endotronic
|
||||
Generator
|
||||
gibson042
|
||||
How-Bout-No
|
||||
invertico
|
||||
jahway603
|
||||
jwoglom
|
||||
lksj
|
||||
megapro17
|
||||
mlampe
|
||||
MrOctopus
|
||||
nosoop
|
||||
puc9
|
||||
sashashura
|
||||
schnusch
|
||||
SG5
|
||||
the-marenga
|
||||
tkgmomosheep
|
||||
vitkhab
|
||||
glensc
|
||||
synthpop123
|
||||
tntmod54321
|
||||
milkknife
|
||||
Bnyro
|
||||
CapacitorSet
|
||||
stelcodes
|
||||
skbeh
|
||||
muddi900
|
||||
digitall
|
||||
chengzhicn
|
||||
mexus
|
||||
JChris246
|
||||
redraskal
|
||||
Spicadox
|
||||
barsnick
|
||||
docbender
|
||||
KurtBestor
|
||||
Chrissi2812
|
||||
FrederikNS
|
||||
gschizas
|
||||
JC-Chung
|
||||
mzhou
|
||||
OndrejBakan
|
||||
ab4cbef
|
||||
aionescu
|
||||
amra
|
||||
ByteDream
|
||||
carusocr
|
||||
chexxor
|
||||
felixonmars
|
||||
FrankZ85
|
||||
FriedrichRehren
|
||||
gregsadetsky
|
||||
LeoniePhiline
|
||||
LowSuggestion912
|
||||
Matumo
|
||||
OIRNOIR
|
||||
OMEGARAZER
|
||||
oxamun
|
||||
pmitchell86
|
||||
qbnu
|
||||
qulaz
|
||||
rebane2001
|
||||
road-master
|
||||
rohieb
|
||||
sdht0
|
||||
seproDev
|
||||
Hill-98
|
||||
LXYan2333
|
||||
mushbite
|
||||
venkata-krishnas
|
||||
|
||||
1459
Changelog.md
File diff suppressed because it is too large
@@ -8,7 +8,6 @@ # Collaborators
|
||||
## [pukkandan](https://github.com/pukkandan)
|
||||
|
||||
[](https://ko-fi.com/pukkandan)
|
||||
[](https://github.com/sponsors/pukkandan)
|
||||
|
||||
* Owner of the fork
|
||||
|
||||
@@ -26,16 +25,14 @@ ## [shirt](https://github.com/shirt-dev)
|
||||
|
||||
## [coletdjnz](https://github.com/coletdjnz)
|
||||
|
||||
[](https://github.com/sponsors/coletdjnz)
|
||||
[](https://github.com/sponsors/coletdjnz)
|
||||
|
||||
* Improved plugin architecture
|
||||
* YouTube improvements including: age-gate bypass, private playlists, multiple-clients (to avoid throttling) and a lot of under-the-hood improvements
|
||||
* Added support for new websites YoutubeWebArchive, MainStreaming, PRX, nzherald, Mediaklikk, StarTV etc
|
||||
* Improved/fixed support for Patreon, panopto, gfycat, itv, pbs, SouthParkDE etc
|
||||
* Added support for downloading YoutubeWebArchive videos
|
||||
|
||||
|
||||
|
||||
## [Ashish0804](https://github.com/Ashish0804) <sub><sup>[Inactive]</sup></sub>
|
||||
## [Ashish0804](https://github.com/Ashish0804)
|
||||
|
||||
[](https://ko-fi.com/ashish0804)
|
||||
|
||||
@@ -44,28 +41,10 @@ ## [Ashish0804](https://github.com/Ashish0804) <sub><sup>[Inactive]</sup></sub>
|
||||
* Improved/fixed support for HiDive, HotStar, Hungama, LBRY, LinkedInLearning, Mxplayer, SonyLiv, TV2, Vimeo, VLive etc
|
||||
|
||||
|
||||
## [Lesmiscore](https://github.com/Lesmiscore) <sub><sup>(nao20010128nao)</sup></sub>
|
||||
## [Lesmiscore](https://github.com/Lesmiscore) (nao20010128nao)
|
||||
|
||||
**Bitcoin**: bc1qfd02r007cutfdjwjmyy9w23rjvtls6ncve7r3s
|
||||
**Monacoin**: mona1q3tf7dzvshrhfe3md379xtvt2n22duhglv5dskr
|
||||
|
||||
* Download live from start to end for YouTube
|
||||
* Added support for new websites AbemaTV, mildom, PixivSketch, skeb, radiko, voicy, mirrativ, openrec, whowatch, damtomo, 17.live, mixch etc
|
||||
* Improved/fixed support for fc2, YahooJapanNews, tver, iwara etc
|
||||
|
||||
|
||||
## [bashonly](https://github.com/bashonly)
|
||||
|
||||
* `--update-to`, automated release, nightly builds
|
||||
* `--cookies-from-browser` support for Firefox containers
|
||||
* Added support for new websites Genius, Kick, NBCStations, Triller, VideoKen etc
|
||||
* Improved/fixed support for Anvato, Brightcove, Instagram, ParamountPlus, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
|
||||
|
||||
|
||||
## [Grub4K](https://github.com/Grub4K)
|
||||
|
||||
[](https://ko-fi.com/Grub4K) [](https://github.com/sponsors/Grub4K)
|
||||
|
||||
* `--update-to`, automated release, nightly builds
|
||||
* Rework internals like `traverse_obj`, various core refactors and bugs fixes
|
||||
* Helped fix crunchyroll, Twitter, wrestleuniverse, wistia, slideslive etc
|
||||
* Added support for new websites mildom, PixivSketch, skeb, radiko, voicy, mirrativ, openrec, whowatch, damtomo, 17.live, mixch etc
|
||||
|
||||
@@ -5,6 +5,5 @@ include README.md
|
||||
include completions/*/*
|
||||
include supportedsites.md
|
||||
include yt-dlp.1
|
||||
include requirements.txt
|
||||
recursive-include devscripts *
|
||||
recursive-include test *
|
||||
|
||||
55
Makefile
@@ -9,23 +9,20 @@ tar: yt-dlp.tar.gz
|
||||
# Keep this list in sync with MANIFEST.in
|
||||
# intended use: when building a source distribution,
|
||||
# make pypi-files && python setup.py sdist
|
||||
pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites \
|
||||
completions yt-dlp.1 requirements.txt setup.cfg devscripts/* test/*
|
||||
pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites completions yt-dlp.1 devscripts/* test/*
|
||||
|
||||
.PHONY: all clean install test tar pypi-files completions ot offlinetest codetest supportedsites
|
||||
|
||||
clean-test:
|
||||
rm -rf test/testdata/sigs/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
|
||||
rm -rf test/testdata/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
|
||||
*.frag.aria2 *.frag.urls *.info.json *.live_chat.json *.meta *.part* *.tmp *.temp *.unknown_video *.ytdl \
|
||||
*.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.gif *.jpeg *.jpg *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 \
|
||||
*.mp4 *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.swf *.swp *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
|
||||
*.3gp *.ape *.avi *.desktop *.flac *.flv *.jpeg *.jpg *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 \
|
||||
*.mp4 *.ogg *.opus *.png *.sbv *.srt *.swf *.swp *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
|
||||
clean-dist:
|
||||
rm -rf yt-dlp.1.temp.md yt-dlp.1 README.txt MANIFEST build/ dist/ .coverage cover/ yt-dlp.tar.gz completions/ \
|
||||
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS .mailmap
|
||||
clean-cache:
|
||||
find . \( \
|
||||
-type d -name .pytest_cache -o -type d -name __pycache__ -o -name "*.pyc" -o -name "*.class" \
|
||||
\) -prune -exec rm -rf {} \;
|
||||
find . \( -name "*.pyc" -o -name "*.class" \) -delete
|
||||
|
||||
completion-bash: completions/bash/yt-dlp
|
||||
completion-fish: completions/fish/yt-dlp.fish
|
||||
@@ -33,6 +30,7 @@ completion-zsh: completions/zsh/_yt-dlp
|
||||
lazy-extractors: yt_dlp/extractor/lazy_extractors.py
|
||||
|
||||
PREFIX ?= /usr/local
|
||||
DESTDIR ?= .
|
||||
BINDIR ?= $(PREFIX)/bin
|
||||
MANDIR ?= $(PREFIX)/man
|
||||
SHAREDIR ?= $(PREFIX)/share
|
||||
@@ -45,23 +43,11 @@ SYSCONFDIR = $(shell if [ $(PREFIX) = /usr -o $(PREFIX) = /usr/local ]; then ech
|
||||
MARKDOWN = $(shell if [ `pandoc -v | head -n1 | cut -d" " -f2 | head -c1` = "2" ]; then echo markdown-smart; else echo markdown; fi)
|
||||
|
||||
install: lazy-extractors yt-dlp yt-dlp.1 completions
|
||||
mkdir -p $(DESTDIR)$(BINDIR)
|
||||
install -m755 yt-dlp $(DESTDIR)$(BINDIR)/yt-dlp
|
||||
mkdir -p $(DESTDIR)$(MANDIR)/man1
|
||||
install -m644 yt-dlp.1 $(DESTDIR)$(MANDIR)/man1/yt-dlp.1
|
||||
mkdir -p $(DESTDIR)$(SHAREDIR)/bash-completion/completions
|
||||
install -m644 completions/bash/yt-dlp $(DESTDIR)$(SHAREDIR)/bash-completion/completions/yt-dlp
|
||||
mkdir -p $(DESTDIR)$(SHAREDIR)/zsh/site-functions
|
||||
install -m644 completions/zsh/_yt-dlp $(DESTDIR)$(SHAREDIR)/zsh/site-functions/_yt-dlp
|
||||
mkdir -p $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d
|
||||
install -m644 completions/fish/yt-dlp.fish $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
|
||||
|
||||
uninstall:
|
||||
rm -f $(DESTDIR)$(BINDIR)/yt-dlp
|
||||
rm -f $(DESTDIR)$(MANDIR)/man1/yt-dlp.1
|
||||
rm -f $(DESTDIR)$(SHAREDIR)/bash-completion/completions/yt-dlp
|
||||
rm -f $(DESTDIR)$(SHAREDIR)/zsh/site-functions/_yt-dlp
|
||||
rm -f $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
|
||||
install -Dm755 yt-dlp $(DESTDIR)$(BINDIR)/yt-dlp
|
||||
install -Dm644 yt-dlp.1 $(DESTDIR)$(MANDIR)/man1/yt-dlp.1
|
||||
install -Dm644 completions/bash/yt-dlp $(DESTDIR)$(SHAREDIR)/bash-completion/completions/yt-dlp
|
||||
install -Dm644 completions/zsh/_yt-dlp $(DESTDIR)$(SHAREDIR)/zsh/site-functions/_yt-dlp
|
||||
install -Dm644 completions/fish/yt-dlp.fish $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
|
||||
|
||||
codetest:
|
||||
flake8 .
|
||||
@@ -73,11 +59,9 @@ test:
|
||||
offlinetest: codetest
|
||||
$(PYTHON) -m pytest -k "not download"
|
||||
|
||||
# XXX: This is hard to maintain
|
||||
CODE_FOLDERS = yt_dlp yt_dlp/downloader yt_dlp/extractor yt_dlp/postprocessor yt_dlp/compat yt_dlp/dependencies
|
||||
yt-dlp: yt_dlp/*.py yt_dlp/*/*.py
|
||||
mkdir -p zip
|
||||
for d in $(CODE_FOLDERS) ; do \
|
||||
for d in yt_dlp yt_dlp/downloader yt_dlp/extractor yt_dlp/postprocessor ; do \
|
||||
mkdir -p zip/$$d ;\
|
||||
cp -pPR $$d/*.py zip/$$d/ ;\
|
||||
done
|
||||
@@ -90,10 +74,10 @@ yt-dlp: yt_dlp/*.py yt_dlp/*/*.py
|
||||
rm yt-dlp.zip
|
||||
chmod a+x yt-dlp
|
||||
|
||||
README.md: yt_dlp/*.py yt_dlp/*/*.py devscripts/make_readme.py
|
||||
COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --ignore-config --help | $(PYTHON) devscripts/make_readme.py
|
||||
README.md: yt_dlp/*.py yt_dlp/*/*.py
|
||||
COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --help | $(PYTHON) devscripts/make_readme.py
|
||||
|
||||
CONTRIBUTING.md: README.md devscripts/make_contributing.py
|
||||
CONTRIBUTING.md: README.md
|
||||
$(PYTHON) devscripts/make_contributing.py README.md CONTRIBUTING.md
|
||||
|
||||
issuetemplates: devscripts/make_issue_template.py .github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml .github/ISSUE_TEMPLATE_tmpl/2_site_support_request.yml .github/ISSUE_TEMPLATE_tmpl/3_site_feature_request.yml .github/ISSUE_TEMPLATE_tmpl/4_bug_report.yml .github/ISSUE_TEMPLATE_tmpl/5_feature_request.yml yt_dlp/version.py
|
||||
@@ -110,7 +94,7 @@ supportedsites:
|
||||
README.txt: README.md
|
||||
pandoc -f $(MARKDOWN) -t plain README.md -o README.txt
|
||||
|
||||
yt-dlp.1: README.md devscripts/prepare_manpage.py
|
||||
yt-dlp.1: README.md
|
||||
$(PYTHON) devscripts/prepare_manpage.py yt-dlp.1.temp.md
|
||||
pandoc -s -f $(MARKDOWN) -t man yt-dlp.1.temp.md -o yt-dlp.1
|
||||
rm -f yt-dlp.1.temp.md
|
||||
@@ -127,26 +111,25 @@ completions/fish/yt-dlp.fish: yt_dlp/*.py yt_dlp/*/*.py devscripts/fish-completi
|
||||
mkdir -p completions/fish
|
||||
$(PYTHON) devscripts/fish-completion.py
|
||||
|
||||
_EXTRACTOR_FILES = $(shell find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py')
|
||||
_EXTRACTOR_FILES = $(shell find yt_dlp/extractor -iname '*.py' -and -not -iname 'lazy_extractors.py')
|
||||
yt_dlp/extractor/lazy_extractors.py: devscripts/make_lazy_extractors.py devscripts/lazy_load_template.py $(_EXTRACTOR_FILES)
|
||||
$(PYTHON) devscripts/make_lazy_extractors.py $@
|
||||
|
||||
yt-dlp.tar.gz: all
|
||||
@tar -czf yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
|
||||
@tar -czf $(DESTDIR)/yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
|
||||
--exclude '*.DS_Store' \
|
||||
--exclude '*.kate-swp' \
|
||||
--exclude '*.pyc' \
|
||||
--exclude '*.pyo' \
|
||||
--exclude '*~' \
|
||||
--exclude '__pycache__' \
|
||||
--exclude '.pytest_cache' \
|
||||
--exclude '.git' \
|
||||
-- \
|
||||
README.md supportedsites.md Changelog.md LICENSE \
|
||||
CONTRIBUTING.md Collaborators.md CONTRIBUTORS AUTHORS \
|
||||
Makefile MANIFEST.in yt-dlp.1 README.txt completions \
|
||||
setup.py setup.cfg yt-dlp yt_dlp requirements.txt \
|
||||
devscripts test
|
||||
devscripts test tox.ini pytest.ini
|
||||
|
||||
AUTHORS: .mailmap
|
||||
git shortlog -s -n | cut -f2 | sort > AUTHORS
|
||||
|
||||
@@ -1 +0,0 @@
|
||||
# Empty file needed to make devscripts.utils properly importable from outside
|
||||
@@ -1,12 +1,11 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import unicode_literals
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
from os.path import dirname as dirn
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
|
||||
sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
|
||||
import yt_dlp
|
||||
|
||||
BASH_COMPLETION_FILE = "completions/bash/yt-dlp"
|
||||
@@ -27,5 +26,5 @@ def build_completion(opt_parser):
|
||||
f.write(filled_template)
|
||||
|
||||
|
||||
parser = yt_dlp.parseOpts(ignore_config_files=True)[0]
|
||||
parser = yt_dlp.parseOpts()[0]
|
||||
build_completion(parser)
|
||||
|
||||
435
devscripts/buildserver.py
Normal file
@@ -0,0 +1,435 @@
|
||||
# UNUSED
|
||||
|
||||
#!/usr/bin/python3
|
||||
|
||||
import argparse
|
||||
import ctypes
|
||||
import functools
|
||||
import shutil
|
||||
import subprocess
|
||||
import sys
|
||||
import tempfile
|
||||
import threading
|
||||
import traceback
|
||||
import os.path
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname((os.path.abspath(__file__)))))
|
||||
from yt_dlp.compat import (
|
||||
compat_input,
|
||||
compat_http_server,
|
||||
compat_str,
|
||||
compat_urlparse,
|
||||
)
|
||||
|
||||
# These are not used outside of buildserver.py thus not in compat.py
|
||||
|
||||
try:
|
||||
import winreg as compat_winreg
|
||||
except ImportError: # Python 2
|
||||
import _winreg as compat_winreg
|
||||
|
||||
try:
|
||||
import socketserver as compat_socketserver
|
||||
except ImportError: # Python 2
|
||||
import SocketServer as compat_socketserver
|
||||
|
||||
|
||||
class BuildHTTPServer(compat_socketserver.ThreadingMixIn, compat_http_server.HTTPServer):
|
||||
allow_reuse_address = True
|
||||
|
||||
|
||||
advapi32 = ctypes.windll.advapi32
|
||||
|
||||
SC_MANAGER_ALL_ACCESS = 0xf003f
|
||||
SC_MANAGER_CREATE_SERVICE = 0x02
|
||||
SERVICE_WIN32_OWN_PROCESS = 0x10
|
||||
SERVICE_AUTO_START = 0x2
|
||||
SERVICE_ERROR_NORMAL = 0x1
|
||||
DELETE = 0x00010000
|
||||
SERVICE_STATUS_START_PENDING = 0x00000002
|
||||
SERVICE_STATUS_RUNNING = 0x00000004
|
||||
SERVICE_ACCEPT_STOP = 0x1
|
||||
|
||||
SVCNAME = 'youtubedl_builder'
|
||||
|
||||
LPTSTR = ctypes.c_wchar_p
|
||||
START_CALLBACK = ctypes.WINFUNCTYPE(None, ctypes.c_int, ctypes.POINTER(LPTSTR))
|
||||
|
||||
|
||||
class SERVICE_TABLE_ENTRY(ctypes.Structure):
|
||||
_fields_ = [
|
||||
('lpServiceName', LPTSTR),
|
||||
('lpServiceProc', START_CALLBACK)
|
||||
]
|
||||
|
||||
|
||||
HandlerEx = ctypes.WINFUNCTYPE(
|
||||
ctypes.c_int, # return
|
||||
ctypes.c_int, # dwControl
|
||||
ctypes.c_int, # dwEventType
|
||||
ctypes.c_void_p, # lpEventData,
|
||||
ctypes.c_void_p, # lpContext,
|
||||
)
|
||||
|
||||
|
||||
def _ctypes_array(c_type, py_array):
|
||||
ar = (c_type * len(py_array))()
|
||||
ar[:] = py_array
|
||||
return ar
|
||||
|
||||
|
||||
def win_OpenSCManager():
|
||||
res = advapi32.OpenSCManagerW(None, None, SC_MANAGER_ALL_ACCESS)
|
||||
if not res:
|
||||
raise Exception('Opening service manager failed - '
|
||||
'are you running this as administrator?')
|
||||
return res
|
||||
|
||||
|
||||
def win_install_service(service_name, cmdline):
|
||||
manager = win_OpenSCManager()
|
||||
try:
|
||||
h = advapi32.CreateServiceW(
|
||||
manager, service_name, None,
|
||||
SC_MANAGER_CREATE_SERVICE, SERVICE_WIN32_OWN_PROCESS,
|
||||
SERVICE_AUTO_START, SERVICE_ERROR_NORMAL,
|
||||
cmdline, None, None, None, None, None)
|
||||
if not h:
|
||||
raise OSError('Service creation failed: %s' % ctypes.FormatError())
|
||||
|
||||
advapi32.CloseServiceHandle(h)
|
||||
finally:
|
||||
advapi32.CloseServiceHandle(manager)
|
||||
|
||||
|
||||
def win_uninstall_service(service_name):
|
||||
manager = win_OpenSCManager()
|
||||
try:
|
||||
h = advapi32.OpenServiceW(manager, service_name, DELETE)
|
||||
if not h:
|
||||
raise OSError('Could not find service %s: %s' % (
|
||||
service_name, ctypes.FormatError()))
|
||||
|
||||
try:
|
||||
if not advapi32.DeleteService(h):
|
||||
raise OSError('Deletion failed: %s' % ctypes.FormatError())
|
||||
finally:
|
||||
advapi32.CloseServiceHandle(h)
|
||||
finally:
|
||||
advapi32.CloseServiceHandle(manager)
|
||||
|
||||
|
||||
def win_service_report_event(service_name, msg, is_error=True):
|
||||
with open('C:/sshkeys/log', 'a', encoding='utf-8') as f:
|
||||
f.write(msg + '\n')
|
||||
|
||||
event_log = advapi32.RegisterEventSourceW(None, service_name)
|
||||
if not event_log:
|
||||
raise OSError('Could not report event: %s' % ctypes.FormatError())
|
||||
|
||||
try:
|
||||
type_id = 0x0001 if is_error else 0x0004
|
||||
event_id = 0xc0000000 if is_error else 0x40000000
|
||||
lines = _ctypes_array(LPTSTR, [msg])
|
||||
|
||||
if not advapi32.ReportEventW(
|
||||
event_log, type_id, 0, event_id, None, len(lines), 0,
|
||||
lines, None):
|
||||
raise OSError('Event reporting failed: %s' % ctypes.FormatError())
|
||||
finally:
|
||||
advapi32.DeregisterEventSource(event_log)
|
||||
|
||||
|
||||
def win_service_handler(stop_event, *args):
|
||||
try:
|
||||
raise ValueError('Handler called with args ' + repr(args))
|
||||
TODO
|
||||
except Exception as e:
|
||||
tb = traceback.format_exc()
|
||||
msg = str(e) + '\n' + tb
|
||||
win_service_report_event(service_name, msg, is_error=True)
|
||||
raise
|
||||
|
||||
|
||||
def win_service_set_status(handle, status_code):
|
||||
svcStatus = SERVICE_STATUS()
|
||||
svcStatus.dwServiceType = SERVICE_WIN32_OWN_PROCESS
|
||||
svcStatus.dwCurrentState = status_code
|
||||
svcStatus.dwControlsAccepted = SERVICE_ACCEPT_STOP
|
||||
|
||||
svcStatus.dwServiceSpecificExitCode = 0
|
||||
|
||||
if not advapi32.SetServiceStatus(handle, ctypes.byref(svcStatus)):
|
||||
raise OSError('SetServiceStatus failed: %r' % ctypes.FormatError())
|
||||
|
||||
|
||||
def win_service_main(service_name, real_main, argc, argv_raw):
|
||||
try:
|
||||
# args = [argv_raw[i].value for i in range(argc)]
|
||||
stop_event = threading.Event()
|
||||
handler = HandlerEx(functools.partial(stop_event, win_service_handler))
|
||||
h = advapi32.RegisterServiceCtrlHandlerExW(service_name, handler, None)
|
||||
if not h:
|
||||
raise OSError('Handler registration failed: %s' %
|
||||
ctypes.FormatError())
|
||||
|
||||
TODO
|
||||
except Exception as e:
|
||||
tb = traceback.format_exc()
|
||||
msg = str(e) + '\n' + tb
|
||||
win_service_report_event(service_name, msg, is_error=True)
|
||||
raise
|
||||
|
||||
|
||||
def win_service_start(service_name, real_main):
|
||||
try:
|
||||
cb = START_CALLBACK(
|
||||
functools.partial(win_service_main, service_name, real_main))
|
||||
dispatch_table = _ctypes_array(SERVICE_TABLE_ENTRY, [
|
||||
SERVICE_TABLE_ENTRY(
|
||||
service_name,
|
||||
cb
|
||||
),
|
||||
SERVICE_TABLE_ENTRY(None, ctypes.cast(None, START_CALLBACK))
|
||||
])
|
||||
|
||||
if not advapi32.StartServiceCtrlDispatcherW(dispatch_table):
|
||||
raise OSError('ctypes start failed: %s' % ctypes.FormatError())
|
||||
except Exception as e:
|
||||
tb = traceback.format_exc()
|
||||
msg = str(e) + '\n' + tb
|
||||
win_service_report_event(service_name, msg, is_error=True)
|
||||
raise
|
||||
|
||||
|
||||
def main(args=None):
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument('-i', '--install',
|
||||
action='store_const', dest='action', const='install',
|
||||
help='Launch at Windows startup')
|
||||
parser.add_argument('-u', '--uninstall',
|
||||
action='store_const', dest='action', const='uninstall',
|
||||
help='Remove Windows service')
|
||||
parser.add_argument('-s', '--service',
|
||||
action='store_const', dest='action', const='service',
|
||||
help='Run as a Windows service')
|
||||
parser.add_argument('-b', '--bind', metavar='<host:port>',
|
||||
action='store', default='0.0.0.0:8142',
|
||||
help='Bind to host:port (default %default)')
|
||||
options = parser.parse_args(args=args)
|
||||
|
||||
if options.action == 'install':
|
||||
fn = os.path.abspath(__file__).replace('v:', '\\\\vboxsrv\\vbox')
|
||||
cmdline = '%s %s -s -b %s' % (sys.executable, fn, options.bind)
|
||||
win_install_service(SVCNAME, cmdline)
|
||||
return
|
||||
|
||||
if options.action == 'uninstall':
|
||||
win_uninstall_service(SVCNAME)
|
||||
return
|
||||
|
||||
if options.action == 'service':
|
||||
win_service_start(SVCNAME, main)
|
||||
return
|
||||
|
||||
host, port_str = options.bind.split(':')
|
||||
port = int(port_str)
|
||||
|
||||
print('Listening on %s:%d' % (host, port))
|
||||
srv = BuildHTTPServer((host, port), BuildHTTPRequestHandler)
|
||||
thr = threading.Thread(target=srv.serve_forever)
|
||||
thr.start()
|
||||
compat_input('Press ENTER to shut down')
|
||||
srv.shutdown()
|
||||
thr.join()
|
||||
|
||||
|
||||
def rmtree(path):
|
||||
for name in os.listdir(path):
|
||||
fname = os.path.join(path, name)
|
||||
if os.path.isdir(fname):
|
||||
rmtree(fname)
|
||||
else:
|
||||
os.chmod(fname, 0o666)
|
||||
os.remove(fname)
|
||||
os.rmdir(path)
|
||||
|
||||
|
||||
class BuildError(Exception):
|
||||
def __init__(self, output, code=500):
|
||||
self.output = output
|
||||
self.code = code
|
||||
|
||||
def __str__(self):
|
||||
return self.output
|
||||
|
||||
|
||||
class HTTPError(BuildError):
|
||||
pass
|
||||
|
||||
|
||||
class PythonBuilder(object):
|
||||
def __init__(self, **kwargs):
|
||||
python_version = kwargs.pop('python', '3.4')
|
||||
python_path = None
|
||||
for node in ('Wow6432Node\\', ''):
|
||||
try:
|
||||
key = compat_winreg.OpenKey(
|
||||
compat_winreg.HKEY_LOCAL_MACHINE,
|
||||
r'SOFTWARE\%sPython\PythonCore\%s\InstallPath' % (node, python_version))
|
||||
try:
|
||||
python_path, _ = compat_winreg.QueryValueEx(key, '')
|
||||
finally:
|
||||
compat_winreg.CloseKey(key)
|
||||
break
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
if not python_path:
|
||||
raise BuildError('No such Python version: %s' % python_version)
|
||||
|
||||
self.pythonPath = python_path
|
||||
|
||||
super(PythonBuilder, self).__init__(**kwargs)
|
||||
|
||||
|
||||
class GITInfoBuilder(object):
|
||||
def __init__(self, **kwargs):
|
||||
try:
|
||||
self.user, self.repoName = kwargs['path'][:2]
|
||||
self.rev = kwargs.pop('rev')
|
||||
except ValueError:
|
||||
raise BuildError('Invalid path')
|
||||
except KeyError as e:
|
||||
raise BuildError('Missing mandatory parameter "%s"' % e.args[0])
|
||||
|
||||
path = os.path.join(os.environ['APPDATA'], 'Build archive', self.repoName, self.user)
|
||||
if not os.path.exists(path):
|
||||
os.makedirs(path)
|
||||
self.basePath = tempfile.mkdtemp(dir=path)
|
||||
self.buildPath = os.path.join(self.basePath, 'build')
|
||||
|
||||
super(GITInfoBuilder, self).__init__(**kwargs)
|
||||
|
||||
|
||||
class GITBuilder(GITInfoBuilder):
|
||||
def build(self):
|
||||
try:
|
||||
subprocess.check_output(['git', 'clone', 'git://github.com/%s/%s.git' % (self.user, self.repoName), self.buildPath])
|
||||
subprocess.check_output(['git', 'checkout', self.rev], cwd=self.buildPath)
|
||||
except subprocess.CalledProcessError as e:
|
||||
raise BuildError(e.output)
|
||||
|
||||
super(GITBuilder, self).build()
|
||||
|
||||
|
||||
class YoutubeDLBuilder(object):
|
||||
authorizedUsers = ['fraca7', 'phihag', 'rg3', 'FiloSottile', 'ytdl-org']
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
if self.repoName != 'yt-dlp':
|
||||
raise BuildError('Invalid repository "%s"' % self.repoName)
|
||||
if self.user not in self.authorizedUsers:
|
||||
raise HTTPError('Unauthorized user "%s"' % self.user, 401)
|
||||
|
||||
super(YoutubeDLBuilder, self).__init__(**kwargs)
|
||||
|
||||
def build(self):
|
||||
try:
|
||||
proc = subprocess.Popen([os.path.join(self.pythonPath, 'python.exe'), 'setup.py', 'py2exe'], stdin=subprocess.PIPE, cwd=self.buildPath)
|
||||
proc.wait()
|
||||
#subprocess.check_output([os.path.join(self.pythonPath, 'python.exe'), 'setup.py', 'py2exe'],
|
||||
# cwd=self.buildPath)
|
||||
except subprocess.CalledProcessError as e:
|
||||
raise BuildError(e.output)
|
||||
|
||||
super(YoutubeDLBuilder, self).build()
|
||||
|
||||
|
||||
class DownloadBuilder(object):
|
||||
def __init__(self, **kwargs):
|
||||
self.handler = kwargs.pop('handler')
|
||||
self.srcPath = os.path.join(self.buildPath, *tuple(kwargs['path'][2:]))
|
||||
self.srcPath = os.path.abspath(os.path.normpath(self.srcPath))
|
||||
if not self.srcPath.startswith(self.buildPath):
|
||||
raise HTTPError(self.srcPath, 401)
|
||||
|
||||
super(DownloadBuilder, self).__init__(**kwargs)
|
||||
|
||||
def build(self):
|
||||
if not os.path.exists(self.srcPath):
|
||||
raise HTTPError('No such file', 404)
|
||||
if os.path.isdir(self.srcPath):
|
||||
raise HTTPError('Is a directory: %s' % self.srcPath, 401)
|
||||
|
||||
self.handler.send_response(200)
|
||||
self.handler.send_header('Content-Type', 'application/octet-stream')
|
||||
self.handler.send_header('Content-Disposition', 'attachment; filename=%s' % os.path.split(self.srcPath)[-1])
|
||||
self.handler.send_header('Content-Length', str(os.stat(self.srcPath).st_size))
|
||||
self.handler.end_headers()
|
||||
|
||||
with open(self.srcPath, 'rb') as src:
|
||||
shutil.copyfileobj(src, self.handler.wfile)
|
||||
|
||||
super(DownloadBuilder, self).build()
|
||||
|
||||
|
||||
class CleanupTempDir(object):
|
||||
def build(self):
|
||||
try:
|
||||
rmtree(self.basePath)
|
||||
except Exception as e:
|
||||
print('WARNING deleting "%s": %s' % (self.basePath, e))
|
||||
|
||||
super(CleanupTempDir, self).build()
|
||||
|
||||
|
||||
class Null(object):
|
||||
def __init__(self, **kwargs):
|
||||
pass
|
||||
|
||||
def start(self):
|
||||
pass
|
||||
|
||||
def close(self):
|
||||
pass
|
||||
|
||||
def build(self):
|
||||
pass
|
||||
|
||||
|
||||
class Builder(PythonBuilder, GITBuilder, YoutubeDLBuilder, DownloadBuilder, CleanupTempDir, Null):
|
||||
pass
|
||||
|
||||
|
||||
class BuildHTTPRequestHandler(compat_http_server.BaseHTTPRequestHandler):
|
||||
actionDict = {'build': Builder, 'download': Builder} # They're the same, no more caching.
|
||||
|
||||
def do_GET(self):
|
||||
path = compat_urlparse.urlparse(self.path)
|
||||
paramDict = dict([(key, value[0]) for key, value in compat_urlparse.parse_qs(path.query).items()])
|
||||
action, _, path = path.path.strip('/').partition('/')
|
||||
if path:
|
||||
path = path.split('/')
|
||||
if action in self.actionDict:
|
||||
try:
|
||||
builder = self.actionDict[action](path=path, handler=self, **paramDict)
|
||||
builder.start()
|
||||
try:
|
||||
builder.build()
|
||||
finally:
|
||||
builder.close()
|
||||
except BuildError as e:
|
||||
self.send_response(e.code)
|
||||
msg = compat_str(e).encode('UTF-8')
|
||||
self.send_header('Content-Type', 'text/plain; charset=UTF-8')
|
||||
self.send_header('Content-Length', len(msg))
|
||||
self.end_headers()
|
||||
self.wfile.write(msg)
|
||||
else:
|
||||
self.send_response(500, 'Unknown build method "%s"' % action)
|
||||
else:
|
||||
self.send_response(500, 'Malformed URL')
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
@@ -1,12 +0,0 @@
|
||||
[
|
||||
{
|
||||
"action": "add",
|
||||
"when": "776d1c3f0c9b00399896dd2e40e78e9a43218109",
|
||||
"short": "[priority] **A new release type has been added!**\n * [`nightly`](https://github.com/yt-dlp/yt-dlp/releases/tag/nightly) builds will be made after each push, containing the latest fixes (but also possibly bugs).\n * When using `--update`/`-U`, a release binary will only update to its current channel (either `stable` or `nightly`).\n * The `--update-to` option has been added allowing the user more control over program upgrades (or downgrades).\n * `--update-to` can change the release channel (`stable`, `nightly`) and also upgrade or downgrade to specific tags.\n * **Usage**: `--update-to CHANNEL`, `--update-to TAG`, `--update-to CHANNEL@TAG`"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "776d1c3f0c9b00399896dd2e40e78e9a43218109",
|
||||
"short": "[priority] **YouTube throttling fixes!**"
|
||||
}
|
||||
]
|
||||
@@ -1,96 +0,0 @@
|
||||
{
|
||||
"$schema": "http://json-schema.org/draft/2020-12/schema",
|
||||
"type": "array",
|
||||
"uniqueItems": true,
|
||||
"items": {
|
||||
"type": "object",
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"enum": [
|
||||
"add"
|
||||
]
|
||||
},
|
||||
"when": {
|
||||
"type": "string",
|
||||
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
|
||||
},
|
||||
"hash": {
|
||||
"type": "string",
|
||||
"pattern": "^[0-9a-f]{40}$"
|
||||
},
|
||||
"short": {
|
||||
"type": "string"
|
||||
},
|
||||
"authors": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"action",
|
||||
"short"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"enum": [
|
||||
"remove"
|
||||
]
|
||||
},
|
||||
"when": {
|
||||
"type": "string",
|
||||
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
|
||||
},
|
||||
"hash": {
|
||||
"type": "string",
|
||||
"pattern": "^[0-9a-f]{40}$"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"action",
|
||||
"hash"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"enum": [
|
||||
"change"
|
||||
]
|
||||
},
|
||||
"when": {
|
||||
"type": "string",
|
||||
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
|
||||
},
|
||||
"hash": {
|
||||
"type": "string",
|
||||
"pattern": "^[0-9a-f]{40}$"
|
||||
},
|
||||
"short": {
|
||||
"type": "string"
|
||||
},
|
||||
"authors": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"action",
|
||||
"hash",
|
||||
"short",
|
||||
"authors"
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
||||
@@ -1,4 +1,6 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import unicode_literals
|
||||
|
||||
"""
|
||||
This script employs a VERY basic heuristic ('porn' in webpage.lower()) to check
|
||||
if we are not 'age_limit' tagging some porn site
|
||||
@@ -10,14 +12,11 @@
|
||||
# Allow direct execution
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
|
||||
import urllib.parse
|
||||
import urllib.request
|
||||
|
||||
from test.helper import gettestcases
|
||||
from yt_dlp.utils import compat_urllib_parse_urlparse
|
||||
from yt_dlp.utils import compat_urllib_request
|
||||
|
||||
if len(sys.argv) > 1:
|
||||
METHOD = 'LIST'
|
||||
@@ -28,9 +27,9 @@
|
||||
for test in gettestcases():
|
||||
if METHOD == 'EURISTIC':
|
||||
try:
|
||||
webpage = urllib.request.urlopen(test['url'], timeout=10).read()
|
||||
webpage = compat_urllib_request.urlopen(test['url'], timeout=10).read()
|
||||
except Exception:
|
||||
print('\nFail: {}'.format(test['name']))
|
||||
print('\nFail: {0}'.format(test['name']))
|
||||
continue
|
||||
|
||||
webpage = webpage.decode('utf8', 'replace')
|
||||
@@ -38,9 +37,9 @@
|
||||
RESULT = 'porn' in webpage.lower()
|
||||
|
||||
elif METHOD == 'LIST':
|
||||
domain = urllib.parse.urlparse(test['url']).netloc
|
||||
domain = compat_urllib_parse_urlparse(test['url']).netloc
|
||||
if not domain:
|
||||
print('\nFail: {}'.format(test['name']))
|
||||
print('\nFail: {0}'.format(test['name']))
|
||||
continue
|
||||
domain = '.'.join(domain.split('.')[-2:])
|
||||
|
||||
@@ -48,11 +47,11 @@
|
||||
|
||||
if RESULT and ('info_dict' not in test or 'age_limit' not in test['info_dict']
|
||||
or test['info_dict']['age_limit'] != 18):
|
||||
print('\nPotential missing age_limit check: {}'.format(test['name']))
|
||||
print('\nPotential missing age_limit check: {0}'.format(test['name']))
|
||||
|
||||
elif not RESULT and ('info_dict' in test and 'age_limit' in test['info_dict']
|
||||
and test['info_dict']['age_limit'] == 18):
|
||||
print('\nPotential false negative: {}'.format(test['name']))
|
||||
print('\nPotential false negative: {0}'.format(test['name']))
|
||||
|
||||
else:
|
||||
sys.stdout.write('.')
|
||||
|
||||
112
devscripts/create-github-release.py
Normal file
@@ -0,0 +1,112 @@
|
||||
# Unused
|
||||
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import unicode_literals
|
||||
|
||||
import io
|
||||
import json
|
||||
import mimetypes
|
||||
import netrc
|
||||
import optparse
|
||||
import os
|
||||
import re
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
from yt_dlp.compat import (
|
||||
compat_basestring,
|
||||
compat_getpass,
|
||||
compat_print,
|
||||
compat_urllib_request,
|
||||
)
|
||||
from yt_dlp.utils import (
|
||||
make_HTTPS_handler,
|
||||
sanitized_Request,
|
||||
)
|
||||
|
||||
|
||||
class GitHubReleaser(object):
|
||||
_API_URL = 'https://api.github.com/repos/ytdl-org/youtube-dl/releases'
|
||||
_UPLOADS_URL = 'https://uploads.github.com/repos/ytdl-org/youtube-dl/releases/%s/assets?name=%s'
|
||||
_NETRC_MACHINE = 'github.com'
|
||||
|
||||
def __init__(self, debuglevel=0):
|
||||
self._init_github_account()
|
||||
https_handler = make_HTTPS_handler({}, debuglevel=debuglevel)
|
||||
self._opener = compat_urllib_request.build_opener(https_handler)
|
||||
|
||||
def _init_github_account(self):
|
||||
try:
|
||||
info = netrc.netrc().authenticators(self._NETRC_MACHINE)
|
||||
if info is not None:
|
||||
self._token = info[2]
|
||||
compat_print('Using GitHub credentials found in .netrc...')
|
||||
return
|
||||
else:
|
||||
compat_print('No GitHub credentials found in .netrc')
|
||||
except (IOError, netrc.NetrcParseError):
|
||||
compat_print('Unable to parse .netrc')
|
||||
self._token = compat_getpass(
|
||||
'Type your GitHub PAT (personal access token) and press [Return]: ')
|
||||
|
||||
def _call(self, req):
|
||||
if isinstance(req, compat_basestring):
|
||||
req = sanitized_Request(req)
|
||||
req.add_header('Authorization', 'token %s' % self._token)
|
||||
response = self._opener.open(req).read().decode('utf-8')
|
||||
return json.loads(response)
|
||||
|
||||
def list_releases(self):
|
||||
return self._call(self._API_URL)
|
||||
|
||||
def create_release(self, tag_name, name=None, body='', draft=False, prerelease=False):
|
||||
data = {
|
||||
'tag_name': tag_name,
|
||||
'target_commitish': 'master',
|
||||
'name': name,
|
||||
'body': body,
|
||||
'draft': draft,
|
||||
'prerelease': prerelease,
|
||||
}
|
||||
req = sanitized_Request(self._API_URL, json.dumps(data).encode('utf-8'))
|
||||
return self._call(req)
|
||||
|
||||
def create_asset(self, release_id, asset):
|
||||
asset_name = os.path.basename(asset)
|
||||
url = self._UPLOADS_URL % (release_id, asset_name)
|
||||
# Our files are small enough to be loaded directly into memory.
|
||||
data = open(asset, 'rb').read()
|
||||
req = sanitized_Request(url, data)
|
||||
mime_type, _ = mimetypes.guess_type(asset_name)
|
||||
req.add_header('Content-Type', mime_type or 'application/octet-stream')
|
||||
return self._call(req)
|
||||
|
||||
|
||||
def main():
|
||||
parser = optparse.OptionParser(usage='%prog CHANGELOG VERSION BUILDPATH')
|
||||
options, args = parser.parse_args()
|
||||
if len(args) != 3:
|
||||
parser.error('Expected a version and a build directory')
|
||||
|
||||
changelog_file, version, build_path = args
|
||||
|
||||
with io.open(changelog_file, encoding='utf-8') as inf:
|
||||
changelog = inf.read()
|
||||
|
||||
mobj = re.search(r'(?s)version %s\n{2}(.+?)\n{3}' % version, changelog)
|
||||
body = mobj.group(1) if mobj else ''
|
||||
|
||||
releaser = GitHubReleaser()
|
||||
|
||||
new_release = releaser.create_release(
|
||||
version, name='yt-dlp %s' % version, body=body)
|
||||
release_id = new_release['id']
|
||||
|
||||
for asset in os.listdir(build_path):
|
||||
compat_print('Uploading %s...' % asset)
|
||||
releaser.create_asset(release_id, os.path.join(build_path, asset))
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
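For reference, the `_init_github_account` helper above only reads the standard ~/.netrc format; a minimal entry it would pick up (hypothetical user name, token redacted) looks like:

    machine github.com
    login your-github-username
    password <personal access token>

The token ends up in `info[2]`, which is why it is stored in the password field of the entry.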
@@ -1,14 +1,12 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from __future__ import unicode_literals

import optparse
import os
from os.path import dirname as dirn
import sys

sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
import yt_dlp
from yt_dlp.utils import shell_quote

@@ -48,5 +46,5 @@ def build_completion(opt_parser):
    f.write(filled_template)


parser = yt_dlp.parseOpts(ignore_config_files=True)[0]
parser = yt_dlp.parseOpts()[0]
build_completion(parser)
@@ -1,17 +1,15 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from __future__ import unicode_literals

import codecs
import subprocess

from yt_dlp.aes import aes_encrypt, key_expansion
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from yt_dlp.utils import intlist_to_bytes
from yt_dlp.aes import aes_encrypt, key_expansion

secret_msg = b'Secret message goes here'
43
devscripts/gh-pages.unused/add-version.py
Normal file
@@ -0,0 +1,43 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

import json
import sys
import hashlib
import os.path


if len(sys.argv) <= 1:
    print('Specify the version number as parameter')
    sys.exit()
version = sys.argv[1]

with open('update/LATEST_VERSION', 'w') as f:
    f.write(version)

versions_info = json.load(open('update/versions.json'))
if 'signature' in versions_info:
    del versions_info['signature']

new_version = {}

filenames = {
    'bin': 'yt-dlp',
    'exe': 'yt-dlp.exe',
    'tar': 'yt-dlp-%s.tar.gz' % version}
build_dir = os.path.join('..', '..', 'build', version)
for key, filename in filenames.items():
    url = 'https://yt-dl.org/downloads/%s/%s' % (version, filename)
    fn = os.path.join(build_dir, filename)
    with open(fn, 'rb') as f:
        data = f.read()
    if not data:
        raise ValueError('File %s is empty!' % fn)
    sha256sum = hashlib.sha256(data).hexdigest()
    new_version[key] = (url, sha256sum)

versions_info['versions'][version] = new_version
versions_info['latest'] = version

with open('update/versions.json', 'w') as jsonf:
    json.dump(versions_info, jsonf, indent=4, sort_keys=True)
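As a sketch of what add-version.py writes (illustrative values only; URLs and hashes are placeholders), update/versions.json ends up with entries of the form:

    {
        "latest": "2021.01.01",
        "versions": {
            "2021.01.01": {
                "bin": ["https://yt-dl.org/downloads/2021.01.01/yt-dlp", "<sha256>"],
                "exe": ["https://yt-dl.org/downloads/2021.01.01/yt-dlp.exe", "<sha256>"],
                "tar": ["https://yt-dl.org/downloads/2021.01.01/yt-dlp-2021.01.01.tar.gz", "<sha256>"]
            }
        }
    }

The `(url, sha256sum)` tuples serialize as two-element JSON arrays, and sign-versions.py below later adds a top-level "signature" key.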
22
devscripts/gh-pages.unused/generate-download.py
Normal file
@@ -0,0 +1,22 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

import json

versions_info = json.load(open('update/versions.json'))
version = versions_info['latest']
version_dict = versions_info['versions'][version]

# Read template page
with open('download.html.in', 'r', encoding='utf-8') as tmplf:
    template = tmplf.read()

template = template.replace('@PROGRAM_VERSION@', version)
template = template.replace('@PROGRAM_URL@', version_dict['bin'][0])
template = template.replace('@PROGRAM_SHA256SUM@', version_dict['bin'][1])
template = template.replace('@EXE_URL@', version_dict['exe'][0])
template = template.replace('@EXE_SHA256SUM@', version_dict['exe'][1])
template = template.replace('@TAR_URL@', version_dict['tar'][0])
template = template.replace('@TAR_SHA256SUM@', version_dict['tar'][1])
with open('download.html', 'w', encoding='utf-8') as dlf:
    dlf.write(template)
34
devscripts/gh-pages.unused/sign-versions.py
Normal file
@@ -0,0 +1,34 @@
#!/usr/bin/env python3
from __future__ import unicode_literals, with_statement

import rsa
import json
from binascii import hexlify

try:
    input = raw_input
except NameError:
    pass

versions_info = json.load(open('update/versions.json'))
if 'signature' in versions_info:
    del versions_info['signature']

print('Enter the PKCS1 private key, followed by a blank line:')
privkey = b''
while True:
    try:
        line = input()
    except EOFError:
        break
    if line == '':
        break
    privkey += line.encode('ascii') + b'\n'
privkey = rsa.PrivateKey.load_pkcs1(privkey)

signature = hexlify(rsa.pkcs1.sign(json.dumps(versions_info, sort_keys=True).encode('utf-8'), privkey, 'SHA-256')).decode()
print('signature: ' + signature)

versions_info['signature'] = signature
with open('update/versions.json', 'w') as versionsf:
    json.dump(versions_info, versionsf, indent=4, sort_keys=True)
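For context, the hex signature written above can be checked with the same `rsa` package; a minimal verification sketch (the key location and function name are assumptions, not part of the script):

    import json
    import rsa
    from binascii import unhexlify

    def verify_versions(versions_info, public_key_pem):
        info = dict(versions_info)
        signature = unhexlify(info.pop('signature'))
        message = json.dumps(info, sort_keys=True).encode('utf-8')
        pub_key = rsa.PublicKey.load_pkcs1(public_key_pem)
        rsa.verify(message, signature, pub_key)  # raises rsa.VerificationError if tampered
        return True

Note that the signed payload excludes the 'signature' key, mirroring the deletion at the top of sign-versions.py.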
21
devscripts/gh-pages.unused/update-copyright.py
Normal file
@@ -0,0 +1,21 @@
#!/usr/bin/env python3
# coding: utf-8

from __future__ import with_statement, unicode_literals

import datetime
import glob
import io  # For Python 2 compatibility
import os
import re

year = str(datetime.datetime.now().year)
for fn in glob.glob('*.html*'):
    with io.open(fn, encoding='utf-8') as f:
        content = f.read()
    newc = re.sub(r'(?P<copyright>Copyright © 2011-)(?P<year>[0-9]{4})', 'Copyright © 2011-' + year, content)
    if content != newc:
        tmpFn = fn + '.part'
        with io.open(tmpFn, 'wt', encoding='utf-8') as outf:
            outf.write(newc)
        os.rename(tmpFn, fn)
76
devscripts/gh-pages.unused/update-feed.py
Normal file
76
devscripts/gh-pages.unused/update-feed.py
Normal file
@@ -0,0 +1,76 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import unicode_literals
|
||||
|
||||
import datetime
|
||||
import io
|
||||
import json
|
||||
import textwrap
|
||||
|
||||
|
||||
atom_template = textwrap.dedent("""\
|
||||
<?xml version="1.0" encoding="utf-8"?>
|
||||
<feed xmlns="http://www.w3.org/2005/Atom">
|
||||
<link rel="self" href="http://ytdl-org.github.io/youtube-dl/update/releases.atom" />
|
||||
<title>yt-dlp releases</title>
|
||||
<id>https://yt-dl.org/feed/yt-dlp-updates-feed</id>
|
||||
<updated>@TIMESTAMP@</updated>
|
||||
@ENTRIES@
|
||||
</feed>""")
|
||||
|
||||
entry_template = textwrap.dedent("""
|
||||
<entry>
|
||||
<id>https://yt-dl.org/feed/yt-dlp-updates-feed/yt-dlp-@VERSION@</id>
|
||||
<title>New version @VERSION@</title>
|
||||
<link href="http://ytdl-org.github.io/yt-dlp" />
|
||||
<content type="xhtml">
|
||||
<div xmlns="http://www.w3.org/1999/xhtml">
|
||||
Downloads available at <a href="https://yt-dl.org/downloads/@VERSION@/">https://yt-dl.org/downloads/@VERSION@/</a>
|
||||
</div>
|
||||
</content>
|
||||
<author>
|
||||
<name>The yt-dlp maintainers</name>
|
||||
</author>
|
||||
<updated>@TIMESTAMP@</updated>
|
||||
</entry>
|
||||
""")
|
||||
|
||||
now = datetime.datetime.now()
|
||||
now_iso = now.isoformat() + 'Z'
|
||||
|
||||
atom_template = atom_template.replace('@TIMESTAMP@', now_iso)
|
||||
|
||||
versions_info = json.load(open('update/versions.json'))
|
||||
versions = list(versions_info['versions'].keys())
|
||||
versions.sort()
|
||||
|
||||
entries = []
|
||||
for v in versions:
|
||||
fields = v.split('.')
|
||||
year, month, day = map(int, fields[:3])
|
||||
faked = 0
|
||||
patchlevel = 0
|
||||
while True:
|
||||
try:
|
||||
datetime.date(year, month, day)
|
||||
except ValueError:
|
||||
day -= 1
|
||||
faked += 1
|
||||
assert day > 0
|
||||
continue
|
||||
break
|
||||
if len(fields) >= 4:
|
||||
try:
|
||||
patchlevel = int(fields[3])
|
||||
except ValueError:
|
||||
patchlevel = 1
|
||||
timestamp = '%04d-%02d-%02dT00:%02d:%02dZ' % (year, month, day, faked, patchlevel)
|
||||
|
||||
entry = entry_template.replace('@TIMESTAMP@', timestamp)
|
||||
entry = entry.replace('@VERSION@', v)
|
||||
entries.append(entry)
|
||||
|
||||
entries_str = textwrap.indent(''.join(entries), '\t')
|
||||
atom_template = atom_template.replace('@ENTRIES@', entries_str)
|
||||
|
||||
with io.open('update/releases.atom', 'w', encoding='utf-8') as atom_file:
|
||||
atom_file.write(atom_template)
|
||||
37
devscripts/gh-pages.unused/update-sites.py
Normal file
@@ -0,0 +1,37 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

import sys
import os
import textwrap

# We must be able to import yt_dlp
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))

import yt_dlp


def main():
    with open('supportedsites.html.in', 'r', encoding='utf-8') as tmplf:
        template = tmplf.read()

    ie_htmls = []
    for ie in yt_dlp.list_extractors(age_limit=None):
        ie_html = '<b>{}</b>'.format(ie.IE_NAME)
        ie_desc = getattr(ie, 'IE_DESC', None)
        if ie_desc is False:
            continue
        elif ie_desc is not None:
            ie_html += ': {}'.format(ie.IE_DESC)
        if not ie.working():
            ie_html += ' (Currently broken)'
        ie_htmls.append('<li>{}</li>'.format(ie_html))

    template = template.replace('@SITES@', textwrap.indent('\n'.join(ie_htmls), '\t'))

    with open('supportedsites.html', 'w', encoding='utf-8') as sitesf:
        sitesf.write(template)


if __name__ == '__main__':
    main()
@@ -1,38 +1,31 @@
import importlib
import random
# coding: utf-8
import re

from ..utils import (
    age_restricted,
    bug_reports_message,
    classproperty,
    write_string,
)

# These bloat the lazy_extractors, so allow them to passthrough silently
ALLOWED_CLASSMETHODS = {'extract_from_webpage', 'get_testcases', 'get_webpage_testcases'}
_WARNED = False
from ..utils import bug_reports_message, write_string


class LazyLoadMetaClass(type):
    def __getattr__(cls, name):
        global _WARNED
        if ('_real_class' not in cls.__dict__
                and name not in ALLOWED_CLASSMETHODS and not _WARNED):
            _WARNED = True
            write_string('WARNING: Falling back to normal extractor since lazy extractor '
                         f'{cls.__name__} does not have attribute {name}{bug_reports_message()}\n')
        return getattr(cls.real_class, name)
        if '_real_class' not in cls.__dict__:
            write_string(
                f'WARNING: Falling back to normal extractor since lazy extractor '
                f'{cls.__name__} does not have attribute {name}{bug_reports_message()}')
        return getattr(cls._get_real_class(), name)


class LazyLoadExtractor(metaclass=LazyLoadMetaClass):
    @classproperty
    def real_class(cls):
    _module = None
    _WORKING = True

    @classmethod
    def _get_real_class(cls):
        if '_real_class' not in cls.__dict__:
            cls._real_class = getattr(importlib.import_module(cls._module), cls.__name__)
            mod = __import__(cls._module, fromlist=(cls.__name__,))
            cls._real_class = getattr(mod, cls.__name__)
        return cls._real_class

    def __new__(cls, *args, **kwargs):
        instance = cls.real_class.__new__(cls.real_class)
        real_cls = cls._get_real_class()
        instance = real_cls.__new__(real_cls)
        instance.__init__(*args, **kwargs)
        return instance
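To make the mechanism above easier to follow: the metaclass intercepts attribute lookups that miss on the stub class, imports the real module on demand, and delegates to it. A self-contained toy version, using a standard-library class as the stand-in target rather than yt-dlp's generated extractors:

    import importlib

    class LazyLoadMeta(type):
        def __getattr__(cls, name):
            # Resolve and cache the real class on the first attribute miss
            if '_real_class' not in cls.__dict__:
                cls._real_class = getattr(importlib.import_module(cls._module), cls.__name__)
            return getattr(cls._real_class, name)

    class JSONDecoder(metaclass=LazyLoadMeta):
        _module = 'json.decoder'  # stand-in for a lazily imported extractor module

    print(JSONDecoder.decode)  # first access triggers the real import

The generated lazy_extractors.py works the same way, except that the stubs also carry _VALID_URL and a copied suitable() where needed (see the make_lazy_extractors.py changes later in this diff), so URL matching does not force the real import.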
@@ -1,470 +0,0 @@
|
||||
from __future__ import annotations
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
import enum
|
||||
import itertools
|
||||
import json
|
||||
import logging
|
||||
import re
|
||||
from collections import defaultdict
|
||||
from dataclasses import dataclass
|
||||
from functools import lru_cache
|
||||
from pathlib import Path
|
||||
|
||||
from devscripts.utils import read_file, run_process, write_file
|
||||
|
||||
BASE_URL = 'https://github.com'
|
||||
LOCATION_PATH = Path(__file__).parent
|
||||
HASH_LENGTH = 7
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class CommitGroup(enum.Enum):
|
||||
UPSTREAM = None
|
||||
PRIORITY = 'Important'
|
||||
CORE = 'Core'
|
||||
EXTRACTOR = 'Extractor'
|
||||
DOWNLOADER = 'Downloader'
|
||||
POSTPROCESSOR = 'Postprocessor'
|
||||
MISC = 'Misc.'
|
||||
|
||||
@classmethod
|
||||
@lru_cache
|
||||
def commit_lookup(cls):
|
||||
return {
|
||||
name: group
|
||||
for group, names in {
|
||||
cls.PRIORITY: {''},
|
||||
cls.UPSTREAM: {'upstream'},
|
||||
cls.CORE: {
|
||||
'aes',
|
||||
'cache',
|
||||
'compat_utils',
|
||||
'compat',
|
||||
'cookies',
|
||||
'core',
|
||||
'dependencies',
|
||||
'jsinterp',
|
||||
'outtmpl',
|
||||
'plugins',
|
||||
'update',
|
||||
'utils',
|
||||
},
|
||||
cls.MISC: {
|
||||
'build',
|
||||
'cleanup',
|
||||
'devscripts',
|
||||
'docs',
|
||||
'misc',
|
||||
'test',
|
||||
},
|
||||
cls.EXTRACTOR: {'extractor', 'extractors'},
|
||||
cls.DOWNLOADER: {'downloader'},
|
||||
cls.POSTPROCESSOR: {'postprocessor'},
|
||||
}.items()
|
||||
for name in names
|
||||
}
|
||||
|
||||
@classmethod
|
||||
def get(cls, value):
|
||||
result = cls.commit_lookup().get(value)
|
||||
if result:
|
||||
logger.debug(f'Mapped {value!r} => {result.name}')
|
||||
return result
|
||||
|
||||
|
||||
@dataclass
|
||||
class Commit:
|
||||
hash: str | None
|
||||
short: str
|
||||
authors: list[str]
|
||||
|
||||
def __str__(self):
|
||||
result = f'{self.short!r}'
|
||||
|
||||
if self.hash:
|
||||
result += f' ({self.hash[:HASH_LENGTH]})'
|
||||
|
||||
if self.authors:
|
||||
authors = ', '.join(self.authors)
|
||||
result += f' by {authors}'
|
||||
|
||||
return result
|
||||
|
||||
|
||||
@dataclass
|
||||
class CommitInfo:
|
||||
details: str | None
|
||||
sub_details: tuple[str, ...]
|
||||
message: str
|
||||
issues: list[str]
|
||||
commit: Commit
|
||||
fixes: list[Commit]
|
||||
|
||||
def key(self):
|
||||
return ((self.details or '').lower(), self.sub_details, self.message)
|
||||
|
||||
|
||||
class Changelog:
|
||||
MISC_RE = re.compile(r'(?:^|\b)(?:lint(?:ing)?|misc|format(?:ting)?|fixes)(?:\b|$)', re.IGNORECASE)
|
||||
|
||||
def __init__(self, groups, repo):
|
||||
self._groups = groups
|
||||
self._repo = repo
|
||||
|
||||
def __str__(self):
|
||||
return '\n'.join(self._format_groups(self._groups)).replace('\t', ' ')
|
||||
|
||||
def _format_groups(self, groups):
|
||||
for item in CommitGroup:
|
||||
group = groups[item]
|
||||
if group:
|
||||
yield self.format_module(item.value, group)
|
||||
|
||||
def format_module(self, name, group):
|
||||
result = f'\n#### {name} changes\n' if name else '\n'
|
||||
return result + '\n'.join(self._format_group(group))
|
||||
|
||||
def _format_group(self, group):
|
||||
sorted_group = sorted(group, key=CommitInfo.key)
|
||||
detail_groups = itertools.groupby(sorted_group, lambda item: (item.details or '').lower())
|
||||
for _, items in detail_groups:
|
||||
items = list(items)
|
||||
details = items[0].details
|
||||
if not details:
|
||||
indent = ''
|
||||
else:
|
||||
yield f'- {details}'
|
||||
indent = '\t'
|
||||
|
||||
if details == 'cleanup':
|
||||
items, cleanup_misc_items = self._filter_cleanup_misc_items(items)
|
||||
|
||||
sub_detail_groups = itertools.groupby(items, lambda item: tuple(map(str.lower, item.sub_details)))
|
||||
for sub_details, entries in sub_detail_groups:
|
||||
if not sub_details:
|
||||
for entry in entries:
|
||||
yield f'{indent}- {self.format_single_change(entry)}'
|
||||
continue
|
||||
|
||||
entries = list(entries)
|
||||
prefix = f'{indent}- {", ".join(entries[0].sub_details)}'
|
||||
if len(entries) == 1:
|
||||
yield f'{prefix}: {self.format_single_change(entries[0])}'
|
||||
continue
|
||||
|
||||
yield prefix
|
||||
for entry in entries:
|
||||
yield f'{indent}\t- {self.format_single_change(entry)}'
|
||||
|
||||
if details == 'cleanup' and cleanup_misc_items:
|
||||
yield from self._format_cleanup_misc_sub_group(cleanup_misc_items)
|
||||
|
||||
def _filter_cleanup_misc_items(self, items):
|
||||
cleanup_misc_items = defaultdict(list)
|
||||
non_misc_items = []
|
||||
for item in items:
|
||||
if self.MISC_RE.search(item.message):
|
||||
cleanup_misc_items[tuple(item.commit.authors)].append(item)
|
||||
else:
|
||||
non_misc_items.append(item)
|
||||
|
||||
return non_misc_items, cleanup_misc_items
|
||||
|
||||
def _format_cleanup_misc_sub_group(self, group):
|
||||
prefix = '\t- Miscellaneous'
|
||||
if len(group) == 1:
|
||||
yield f'{prefix}: {next(self._format_cleanup_misc_items(group))}'
|
||||
return
|
||||
|
||||
yield prefix
|
||||
for message in self._format_cleanup_misc_items(group):
|
||||
yield f'\t\t- {message}'
|
||||
|
||||
def _format_cleanup_misc_items(self, group):
|
||||
for authors, infos in group.items():
|
||||
message = ', '.join(
|
||||
self._format_message_link(None, info.commit.hash)
|
||||
for info in sorted(infos, key=lambda item: item.commit.hash or ''))
|
||||
yield f'{message} by {self._format_authors(authors)}'
|
||||
|
||||
def format_single_change(self, info):
|
||||
message = self._format_message_link(info.message, info.commit.hash)
|
||||
if info.issues:
|
||||
message = f'{message} ({self._format_issues(info.issues)})'
|
||||
|
||||
if info.commit.authors:
|
||||
message = f'{message} by {self._format_authors(info.commit.authors)}'
|
||||
|
||||
if info.fixes:
|
||||
fix_message = ', '.join(f'{self._format_message_link(None, fix.hash)}' for fix in info.fixes)
|
||||
|
||||
authors = sorted({author for fix in info.fixes for author in fix.authors}, key=str.casefold)
|
||||
if authors != info.commit.authors:
|
||||
fix_message = f'{fix_message} by {self._format_authors(authors)}'
|
||||
|
||||
message = f'{message} (With fixes in {fix_message})'
|
||||
|
||||
return message
|
||||
|
||||
def _format_message_link(self, message, hash):
|
||||
assert message or hash, 'Improperly defined commit message or override'
|
||||
message = message if message else hash[:HASH_LENGTH]
|
||||
return f'[{message}]({self.repo_url}/commit/{hash})' if hash else message
|
||||
|
||||
def _format_issues(self, issues):
|
||||
return ', '.join(f'[#{issue}]({self.repo_url}/issues/{issue})' for issue in issues)
|
||||
|
||||
@staticmethod
|
||||
def _format_authors(authors):
|
||||
return ', '.join(f'[{author}]({BASE_URL}/{author})' for author in authors)
|
||||
|
||||
@property
|
||||
def repo_url(self):
|
||||
return f'{BASE_URL}/{self._repo}'
|
||||
|
||||
|
||||
class CommitRange:
|
||||
COMMAND = 'git'
|
||||
COMMIT_SEPARATOR = '-----'
|
||||
|
||||
AUTHOR_INDICATOR_RE = re.compile(r'Authored by:? ', re.IGNORECASE)
|
||||
MESSAGE_RE = re.compile(r'''
|
||||
(?:\[
|
||||
(?P<prefix>[^\]\/:,]+)
|
||||
(?:/(?P<details>[^\]:,]+))?
|
||||
(?:[:,](?P<sub_details>[^\]]+))?
|
||||
\]\ )?
|
||||
(?:(?P<sub_details_alt>`?[^:`]+`?): )?
|
||||
(?P<message>.+?)
|
||||
(?:\ \((?P<issues>\#\d+(?:,\ \#\d+)*)\))?
|
||||
''', re.VERBOSE | re.DOTALL)
|
||||
EXTRACTOR_INDICATOR_RE = re.compile(r'(?:Fix|Add)\s+Extractors?', re.IGNORECASE)
|
||||
FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert)\s+([\da-f]{40})')
|
||||
UPSTREAM_MERGE_RE = re.compile(r'Update to ytdl-commit-([\da-f]+)')
|
||||
|
||||
def __init__(self, start, end, default_author=None):
|
||||
self._start, self._end = start, end
|
||||
self._commits, self._fixes = self._get_commits_and_fixes(default_author)
|
||||
self._commits_added = []
|
||||
|
||||
def __iter__(self):
|
||||
return iter(itertools.chain(self._commits.values(), self._commits_added))
|
||||
|
||||
def __len__(self):
|
||||
return len(self._commits) + len(self._commits_added)
|
||||
|
||||
def __contains__(self, commit):
|
||||
if isinstance(commit, Commit):
|
||||
if not commit.hash:
|
||||
return False
|
||||
commit = commit.hash
|
||||
|
||||
return commit in self._commits
|
||||
|
||||
def _get_commits_and_fixes(self, default_author):
|
||||
result = run_process(
|
||||
self.COMMAND, 'log', f'--format=%H%n%s%n%b%n{self.COMMIT_SEPARATOR}',
|
||||
f'{self._start}..{self._end}' if self._start else self._end).stdout
|
||||
|
||||
commits = {}
|
||||
fixes = defaultdict(list)
|
||||
lines = iter(result.splitlines(False))
|
||||
for i, commit_hash in enumerate(lines):
|
||||
short = next(lines)
|
||||
skip = short.startswith('Release ') or short == '[version] update'
|
||||
|
||||
authors = [default_author] if default_author else []
|
||||
for line in iter(lambda: next(lines), self.COMMIT_SEPARATOR):
|
||||
match = self.AUTHOR_INDICATOR_RE.match(line)
|
||||
if match:
|
||||
authors = sorted(map(str.strip, line[match.end():].split(',')), key=str.casefold)
|
||||
|
||||
commit = Commit(commit_hash, short, authors)
|
||||
if skip and (self._start or not i):
|
||||
logger.debug(f'Skipped commit: {commit}')
|
||||
continue
|
||||
elif skip:
|
||||
logger.debug(f'Reached Release commit, breaking: {commit}')
|
||||
break
|
||||
|
||||
fix_match = self.FIXES_RE.search(commit.short)
|
||||
if fix_match:
|
||||
commitish = fix_match.group(1)
|
||||
fixes[commitish].append(commit)
|
||||
|
||||
commits[commit.hash] = commit
|
||||
|
||||
for commitish, fix_commits in fixes.items():
|
||||
if commitish in commits:
|
||||
hashes = ', '.join(commit.hash[:HASH_LENGTH] for commit in fix_commits)
|
||||
logger.info(f'Found fix(es) for {commitish[:HASH_LENGTH]}: {hashes}')
|
||||
for fix_commit in fix_commits:
|
||||
del commits[fix_commit.hash]
|
||||
else:
|
||||
logger.debug(f'Commit with fixes not in changes: {commitish[:HASH_LENGTH]}')
|
||||
|
||||
return commits, fixes
|
||||
|
||||
def apply_overrides(self, overrides):
|
||||
for override in overrides:
|
||||
when = override.get('when')
|
||||
if when and when not in self and when != self._start:
|
||||
logger.debug(f'Ignored {when!r}, not in commits {self._start!r}')
|
||||
continue
|
||||
|
||||
override_hash = override.get('hash')
|
||||
if override['action'] == 'add':
|
||||
commit = Commit(override.get('hash'), override['short'], override.get('authors') or [])
|
||||
logger.info(f'ADD {commit}')
|
||||
self._commits_added.append(commit)
|
||||
|
||||
elif override['action'] == 'remove':
|
||||
if override_hash in self._commits:
|
||||
logger.info(f'REMOVE {self._commits[override_hash]}')
|
||||
del self._commits[override_hash]
|
||||
|
||||
elif override['action'] == 'change':
|
||||
if override_hash not in self._commits:
|
||||
continue
|
||||
commit = Commit(override_hash, override['short'], override['authors'])
|
||||
logger.info(f'CHANGE {self._commits[commit.hash]} -> {commit}')
|
||||
self._commits[commit.hash] = commit
|
||||
|
||||
self._commits = {key: value for key, value in reversed(self._commits.items())}
|
||||
|
||||
def groups(self):
|
||||
groups = defaultdict(list)
|
||||
for commit in self:
|
||||
upstream_re = self.UPSTREAM_MERGE_RE.match(commit.short)
|
||||
if upstream_re:
|
||||
commit.short = f'[upstream] Merge up to youtube-dl {upstream_re.group(1)}'
|
||||
|
||||
match = self.MESSAGE_RE.fullmatch(commit.short)
|
||||
if not match:
|
||||
logger.error(f'Error parsing short commit message: {commit.short!r}')
|
||||
continue
|
||||
|
||||
prefix, details, sub_details, sub_details_alt, message, issues = match.groups()
|
||||
group = None
|
||||
if prefix:
|
||||
if prefix == 'priority':
|
||||
prefix, _, details = (details or '').partition('/')
|
||||
logger.debug(f'Priority: {message!r}')
|
||||
group = CommitGroup.PRIORITY
|
||||
|
||||
if not details and prefix:
|
||||
if prefix not in ('core', 'downloader', 'extractor', 'misc', 'postprocessor', 'upstream'):
|
||||
logger.debug(f'Replaced details with {prefix!r}')
|
||||
details = prefix or None
|
||||
|
||||
if details == 'common':
|
||||
details = None
|
||||
|
||||
if details:
|
||||
details = details.strip()
|
||||
|
||||
else:
|
||||
group = CommitGroup.CORE
|
||||
|
||||
sub_details = f'{sub_details or ""},{sub_details_alt or ""}'.replace(':', ',')
|
||||
sub_details = tuple(filter(None, map(str.strip, sub_details.split(','))))
|
||||
|
||||
issues = [issue.strip()[1:] for issue in issues.split(',')] if issues else []
|
||||
|
||||
if not group:
|
||||
group = CommitGroup.get(prefix.lower())
|
||||
if not group:
|
||||
if self.EXTRACTOR_INDICATOR_RE.search(commit.short):
|
||||
group = CommitGroup.EXTRACTOR
|
||||
else:
|
||||
group = CommitGroup.POSTPROCESSOR
|
||||
logger.warning(f'Failed to map {commit.short!r}, selected {group.name}')
|
||||
|
||||
commit_info = CommitInfo(
|
||||
details, sub_details, message.strip(),
|
||||
issues, commit, self._fixes[commit.hash])
|
||||
logger.debug(f'Resolved {commit.short!r} to {commit_info!r}')
|
||||
groups[group].append(commit_info)
|
||||
|
||||
return groups
|
||||
|
||||
|
||||
def get_new_contributors(contributors_path, commits):
|
||||
contributors = set()
|
||||
if contributors_path.exists():
|
||||
for line in read_file(contributors_path).splitlines():
|
||||
author, _, _ = line.strip().partition(' (')
|
||||
authors = author.split('/')
|
||||
contributors.update(map(str.casefold, authors))
|
||||
|
||||
new_contributors = set()
|
||||
for commit in commits:
|
||||
for author in commit.authors:
|
||||
author_folded = author.casefold()
|
||||
if author_folded not in contributors:
|
||||
contributors.add(author_folded)
|
||||
new_contributors.add(author)
|
||||
|
||||
return sorted(new_contributors, key=str.casefold)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
import argparse
|
||||
|
||||
parser = argparse.ArgumentParser(
|
||||
description='Create a changelog markdown from a git commit range')
|
||||
parser.add_argument(
|
||||
'commitish', default='HEAD', nargs='?',
|
||||
help='The commitish to create the range from (default: %(default)s)')
|
||||
parser.add_argument(
|
||||
'-v', '--verbosity', action='count', default=0,
|
||||
help='increase verbosity (can be used twice)')
|
||||
parser.add_argument(
|
||||
'-c', '--contributors', action='store_true',
|
||||
help='update CONTRIBUTORS file (default: %(default)s)')
|
||||
parser.add_argument(
|
||||
'--contributors-path', type=Path, default=LOCATION_PATH.parent / 'CONTRIBUTORS',
|
||||
help='path to the CONTRIBUTORS file')
|
||||
parser.add_argument(
|
||||
'--no-override', action='store_true',
|
||||
help='skip override json in commit generation (default: %(default)s)')
|
||||
parser.add_argument(
|
||||
'--override-path', type=Path, default=LOCATION_PATH / 'changelog_override.json',
|
||||
help='path to the changelog_override.json file')
|
||||
parser.add_argument(
|
||||
'--default-author', default='pukkandan',
|
||||
help='the author to use without a author indicator (default: %(default)s)')
|
||||
parser.add_argument(
|
||||
'--repo', default='yt-dlp/yt-dlp',
|
||||
help='the github repository to use for the operations (default: %(default)s)')
|
||||
args = parser.parse_args()
|
||||
|
||||
logging.basicConfig(
|
||||
datefmt='%Y-%m-%d %H-%M-%S', format='{asctime} | {levelname:<8} | {message}',
|
||||
level=logging.WARNING - 10 * args.verbosity, style='{', stream=sys.stderr)
|
||||
|
||||
commits = CommitRange(None, args.commitish, args.default_author)
|
||||
|
||||
if not args.no_override:
|
||||
if args.override_path.exists():
|
||||
overrides = json.loads(read_file(args.override_path))
|
||||
commits.apply_overrides(overrides)
|
||||
else:
|
||||
logger.warning(f'File {args.override_path.as_posix()} does not exist')
|
||||
|
||||
logger.info(f'Loaded {len(commits)} commits')
|
||||
|
||||
new_contributors = get_new_contributors(args.contributors_path, commits)
|
||||
if new_contributors:
|
||||
if args.contributors:
|
||||
write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
|
||||
logger.info(f'New contributors: {", ".join(new_contributors)}')
|
||||
|
||||
print(Changelog(commits.groups(), args.repo))
|
||||
@@ -1,5 +1,7 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

import io
import optparse
import re

@@ -14,7 +16,7 @@ def main():

    infile, outfile = args

    with open(infile, encoding='utf-8') as inf:
    with io.open(infile, encoding='utf-8') as inf:
        readme = inf.read()

    bug_text = re.search(
@@ -24,7 +26,7 @@ def main():

    out = bug_text + dev_text

    with open(outfile, 'w', encoding='utf-8') as outf:
    with io.open(outfile, 'w', encoding='utf-8') as outf:
        outf.write(out)
@@ -1,78 +1,29 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import unicode_literals
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
|
||||
import re
|
||||
|
||||
from devscripts.utils import (
|
||||
get_filename_args,
|
||||
read_file,
|
||||
read_version,
|
||||
write_file,
|
||||
)
|
||||
|
||||
VERBOSE_TMPL = '''
|
||||
- type: checkboxes
|
||||
id: verbose
|
||||
attributes:
|
||||
label: Provide verbose output that clearly demonstrates the problem
|
||||
options:
|
||||
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
|
||||
required: true
|
||||
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
|
||||
required: false
|
||||
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
|
||||
required: true
|
||||
- type: textarea
|
||||
id: log
|
||||
attributes:
|
||||
label: Complete Verbose Output
|
||||
description: |
|
||||
It should start like this:
|
||||
placeholder: |
|
||||
[debug] Command-line config: ['-vU', 'test:youtube']
|
||||
[debug] Portable config "yt-dlp.conf": ['-i']
|
||||
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
|
||||
[debug] yt-dlp version %(version)s [9d339c4] (win32_exe)
|
||||
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
|
||||
[debug] Checking exe version: ffmpeg -bsfs
|
||||
[debug] Checking exe version: ffprobe -bsfs
|
||||
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
|
||||
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
|
||||
[debug] Proxy map: {}
|
||||
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
|
||||
Latest version: %(version)s, Current version: %(version)s
|
||||
yt-dlp is up to date (%(version)s)
|
||||
<more lines>
|
||||
render: shell
|
||||
validations:
|
||||
required: true
|
||||
'''.strip()
|
||||
|
||||
NO_SKIP = '''
|
||||
- type: checkboxes
|
||||
attributes:
|
||||
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
|
||||
description: Fill all fields even if you think it is irrelevant for the issue
|
||||
options:
|
||||
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\\* field
|
||||
required: true
|
||||
'''.strip()
|
||||
import io
|
||||
import optparse
|
||||
|
||||
|
||||
def main():
|
||||
fields = {'version': read_version(), 'no_skip': NO_SKIP}
|
||||
fields['verbose'] = VERBOSE_TMPL % fields
|
||||
fields['verbose_optional'] = re.sub(r'(\n\s+validations:)?\n\s+required: true', '', fields['verbose'])
|
||||
parser = optparse.OptionParser(usage='%prog INFILE OUTFILE')
|
||||
options, args = parser.parse_args()
|
||||
if len(args) != 2:
|
||||
parser.error('Expected an input and an output filename')
|
||||
|
||||
infile, outfile = get_filename_args(has_infile=True)
|
||||
write_file(outfile, read_file(infile) % fields)
|
||||
infile, outfile = args
|
||||
|
||||
with io.open(infile, encoding='utf-8') as inf:
|
||||
issue_template_tmpl = inf.read()
|
||||
|
||||
# Get the version from yt_dlp/version.py without importing the package
|
||||
exec(compile(open('yt_dlp/version.py').read(),
|
||||
'yt_dlp/version.py', 'exec'))
|
||||
|
||||
out = issue_template_tmpl % {'version': locals()['__version__']}
|
||||
|
||||
with io.open(outfile, 'w', encoding='utf-8') as outf:
|
||||
outf.write(out)
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
|
||||
@@ -1,132 +1,105 @@
|
||||
#!/usr/bin/env python3
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
import shutil
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
from __future__ import unicode_literals, print_function
|
||||
|
||||
from inspect import getsource
|
||||
import io
|
||||
import os
|
||||
from os.path import dirname as dirn
|
||||
import sys
|
||||
|
||||
from devscripts.utils import get_filename_args, read_file, write_file
|
||||
sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
|
||||
|
||||
NO_ATTR = object()
|
||||
STATIC_CLASS_PROPERTIES = [
|
||||
'IE_NAME', '_ENABLED', '_VALID_URL', # Used for URL matching
|
||||
'_WORKING', 'IE_DESC', '_NETRC_MACHINE', 'SEARCH_KEY', # Used for --extractor-descriptions
|
||||
'age_limit', # Used for --age-limit (evaluated)
|
||||
'_RETURN_TYPE', # Accessed in CLI only with instance (evaluated)
|
||||
]
|
||||
CLASS_METHODS = [
|
||||
'ie_key', 'suitable', '_match_valid_url', # Used for URL matching
|
||||
'working', 'get_temp_id', '_match_id', # Accessed just before instance creation
|
||||
'description', # Used for --extractor-descriptions
|
||||
'is_suitable', # Used for --age-limit
|
||||
'supports_login', 'is_single_video', # Accessed in CLI only with instance
|
||||
]
|
||||
IE_TEMPLATE = '''
|
||||
lazy_extractors_filename = sys.argv[1] if len(sys.argv) > 1 else 'yt_dlp/extractor/lazy_extractors.py'
|
||||
if os.path.exists(lazy_extractors_filename):
|
||||
os.remove(lazy_extractors_filename)
|
||||
|
||||
# Block plugins from loading
|
||||
plugins_dirname = 'ytdlp_plugins'
|
||||
plugins_blocked_dirname = 'ytdlp_plugins_blocked'
|
||||
if os.path.exists(plugins_dirname):
|
||||
os.rename(plugins_dirname, plugins_blocked_dirname)
|
||||
|
||||
from yt_dlp.extractor import _ALL_CLASSES
|
||||
from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor
|
||||
|
||||
if os.path.exists(plugins_blocked_dirname):
|
||||
os.rename(plugins_blocked_dirname, plugins_dirname)
|
||||
|
||||
with open('devscripts/lazy_load_template.py', 'rt') as f:
|
||||
module_template = f.read()
|
||||
|
||||
CLASS_PROPERTIES = ['ie_key', 'working', '_match_valid_url', 'suitable', '_match_id', 'get_temp_id']
|
||||
module_contents = [
|
||||
module_template,
|
||||
*[getsource(getattr(InfoExtractor, k)) for k in CLASS_PROPERTIES],
|
||||
'\nclass LazyLoadSearchExtractor(LazyLoadExtractor):\n pass\n']
|
||||
|
||||
ie_template = '''
|
||||
class {name}({bases}):
|
||||
_module = {module!r}
|
||||
_module = '{module}'
|
||||
'''
|
||||
MODULE_TEMPLATE = read_file('devscripts/lazy_load_template.py')
|
||||
|
||||
|
||||
def main():
|
||||
lazy_extractors_filename = get_filename_args(default_outfile='yt_dlp/extractor/lazy_extractors.py')
|
||||
if os.path.exists(lazy_extractors_filename):
|
||||
os.remove(lazy_extractors_filename)
|
||||
|
||||
_ALL_CLASSES = get_all_ies() # Must be before import
|
||||
|
||||
import yt_dlp.plugins
|
||||
from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor
|
||||
|
||||
# Filter out plugins
|
||||
_ALL_CLASSES = [cls for cls in _ALL_CLASSES if not cls.__module__.startswith(f'{yt_dlp.plugins.PACKAGE_NAME}.')]
|
||||
|
||||
DummyInfoExtractor = type('InfoExtractor', (InfoExtractor,), {'IE_NAME': NO_ATTR})
|
||||
module_src = '\n'.join((
|
||||
MODULE_TEMPLATE,
|
||||
' _module = None',
|
||||
*extra_ie_code(DummyInfoExtractor),
|
||||
'\nclass LazyLoadSearchExtractor(LazyLoadExtractor):\n pass\n',
|
||||
*build_ies(_ALL_CLASSES, (InfoExtractor, SearchInfoExtractor), DummyInfoExtractor),
|
||||
))
|
||||
|
||||
write_file(lazy_extractors_filename, f'{module_src}\n')
|
||||
def get_base_name(base):
|
||||
if base is InfoExtractor:
|
||||
return 'LazyLoadExtractor'
|
||||
elif base is SearchInfoExtractor:
|
||||
return 'LazyLoadSearchExtractor'
|
||||
else:
|
||||
return base.__name__
|
||||
|
||||
|
||||
def get_all_ies():
|
||||
PLUGINS_DIRNAME = 'ytdlp_plugins'
|
||||
BLOCKED_DIRNAME = f'{PLUGINS_DIRNAME}_blocked'
|
||||
if os.path.exists(PLUGINS_DIRNAME):
|
||||
# os.rename cannot be used, e.g. in Docker. See https://github.com/yt-dlp/yt-dlp/pull/4958
|
||||
shutil.move(PLUGINS_DIRNAME, BLOCKED_DIRNAME)
|
||||
try:
|
||||
from yt_dlp.extractor.extractors import _ALL_CLASSES
|
||||
finally:
|
||||
if os.path.exists(BLOCKED_DIRNAME):
|
||||
shutil.move(BLOCKED_DIRNAME, PLUGINS_DIRNAME)
|
||||
return _ALL_CLASSES
|
||||
def build_lazy_ie(ie, name):
|
||||
s = ie_template.format(
|
||||
name=name,
|
||||
bases=', '.join(map(get_base_name, ie.__bases__)),
|
||||
module=ie.__module__)
|
||||
valid_url = getattr(ie, '_VALID_URL', None)
|
||||
if not valid_url and hasattr(ie, '_make_valid_url'):
|
||||
valid_url = ie._make_valid_url()
|
||||
if valid_url:
|
||||
s += f' _VALID_URL = {valid_url!r}\n'
|
||||
if not ie._WORKING:
|
||||
s += ' _WORKING = False\n'
|
||||
if ie.suitable.__func__ is not InfoExtractor.suitable.__func__:
|
||||
s += f'\n{getsource(ie.suitable)}'
|
||||
return s
|
||||
|
||||
|
||||
def extra_ie_code(ie, base=None):
|
||||
for var in STATIC_CLASS_PROPERTIES:
|
||||
val = getattr(ie, var)
|
||||
if val != (getattr(base, var) if base else NO_ATTR):
|
||||
yield f' {var} = {val!r}'
|
||||
yield ''
|
||||
# find the correct sorting and add the required base classes so that subclasses
|
||||
# can be correctly created
|
||||
classes = _ALL_CLASSES[:-1]
|
||||
ordered_cls = []
|
||||
while classes:
|
||||
for c in classes[:]:
|
||||
bases = set(c.__bases__) - set((object, InfoExtractor, SearchInfoExtractor))
|
||||
stop = False
|
||||
for b in bases:
|
||||
if b not in classes and b not in ordered_cls:
|
||||
if b.__name__ == 'GenericIE':
|
||||
exit()
|
||||
classes.insert(0, b)
|
||||
stop = True
|
||||
if stop:
|
||||
break
|
||||
if all(b in ordered_cls for b in bases):
|
||||
ordered_cls.append(c)
|
||||
classes.remove(c)
|
||||
break
|
||||
ordered_cls.append(_ALL_CLASSES[-1])
|
||||
|
||||
for name in CLASS_METHODS:
|
||||
f = getattr(ie, name)
|
||||
if not base or f.__func__ != getattr(base, name).__func__:
|
||||
yield getsource(f)
|
||||
names = []
|
||||
for ie in ordered_cls:
|
||||
name = ie.__name__
|
||||
src = build_lazy_ie(ie, name)
|
||||
module_contents.append(src)
|
||||
if ie in _ALL_CLASSES:
|
||||
names.append(name)
|
||||
|
||||
module_contents.append(
|
||||
'\n_ALL_CLASSES = [{0}]'.format(', '.join(names)))
|
||||
|
||||
def build_ies(ies, bases, attr_base):
|
||||
names = []
|
||||
for ie in sort_ies(ies, bases):
|
||||
yield build_lazy_ie(ie, ie.__name__, attr_base)
|
||||
if ie in ies:
|
||||
names.append(ie.__name__)
|
||||
module_src = '\n'.join(module_contents) + '\n'
|
||||
|
||||
yield f'\n_ALL_CLASSES = [{", ".join(names)}]'
|
||||
|
||||
|
||||
def sort_ies(ies, ignored_bases):
|
||||
"""find the correct sorting and add the required base classes so that subclasses can be correctly created"""
|
||||
classes, returned_classes = ies[:-1], set()
|
||||
assert ies[-1].__name__ == 'GenericIE', 'Last IE must be GenericIE'
|
||||
while classes:
|
||||
for c in classes[:]:
|
||||
bases = set(c.__bases__) - {object, *ignored_bases}
|
||||
restart = False
|
||||
for b in sorted(bases, key=lambda x: x.__name__):
|
||||
if b not in classes and b not in returned_classes:
|
||||
assert b.__name__ != 'GenericIE', 'Cannot inherit from GenericIE'
|
||||
classes.insert(0, b)
|
||||
restart = True
|
||||
if restart:
|
||||
break
|
||||
if bases <= returned_classes:
|
||||
yield c
|
||||
returned_classes.add(c)
|
||||
classes.remove(c)
|
||||
break
|
||||
yield ies[-1]
|
||||
|
||||
|
||||
def build_lazy_ie(ie, name, attr_base):
|
||||
bases = ', '.join({
|
||||
'InfoExtractor': 'LazyLoadExtractor',
|
||||
'SearchInfoExtractor': 'LazyLoadSearchExtractor',
|
||||
}.get(base.__name__, base.__name__) for base in ie.__bases__)
|
||||
|
||||
s = IE_TEMPLATE.format(name=name, module=ie.__module__, bases=bases)
|
||||
return s + '\n'.join(extra_ie_code(ie, attr_base))
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
with io.open(lazy_extractors_filename, 'wt', encoding='utf-8') as f:
|
||||
f.write(module_src)
|
||||
|
||||
@@ -1,93 +1,31 @@
|
||||
#!/usr/bin/env python3
|
||||
|
||||
"""
|
||||
yt-dlp --help | make_readme.py
|
||||
This must be run in a console of correct width
|
||||
"""
|
||||
# yt-dlp --help | make_readme.py
|
||||
# This must be run in a console of correct width
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
from __future__ import unicode_literals
|
||||
|
||||
import io
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
|
||||
import functools
|
||||
import re
|
||||
|
||||
from devscripts.utils import read_file, write_file
|
||||
|
||||
README_FILE = 'README.md'
|
||||
helptext = sys.stdin.read()
|
||||
|
||||
OPTIONS_START = 'General Options:'
|
||||
OPTIONS_END = 'CONFIGURATION'
|
||||
EPILOG_START = 'See full documentation'
|
||||
ALLOWED_OVERSHOOT = 2
|
||||
if isinstance(helptext, bytes):
|
||||
helptext = helptext.decode('utf-8')
|
||||
|
||||
DISABLE_PATCH = object()
|
||||
with io.open(README_FILE, encoding='utf-8') as f:
|
||||
oldreadme = f.read()
|
||||
|
||||
header = oldreadme[:oldreadme.index('## General Options:')]
|
||||
footer = oldreadme[oldreadme.index('# CONFIGURATION'):]
|
||||
|
||||
def take_section(text, start=None, end=None, *, shift=0):
|
||||
return text[
|
||||
text.index(start) + shift if start else None:
|
||||
text.index(end) + shift if end else None
|
||||
]
|
||||
options = helptext[helptext.index(' General Options:'):]
|
||||
options = re.sub(r'(?m)^ (\w.+)$', r'## \1', options)
|
||||
options = options + '\n'
|
||||
|
||||
|
||||
def apply_patch(text, patch):
|
||||
return text if patch[0] is DISABLE_PATCH else re.sub(*patch, text)
|
||||
|
||||
|
||||
options = take_section(sys.stdin.read(), f'\n {OPTIONS_START}', f'\n{EPILOG_START}', shift=1)
|
||||
|
||||
max_width = max(map(len, options.split('\n')))
|
||||
switch_col_width = len(re.search(r'(?m)^\s{5,}', options).group())
|
||||
delim = f'\n{" " * switch_col_width}'
|
||||
|
||||
PATCHES = (
|
||||
( # Standardize `--update` message
|
||||
r'(?m)^( -U, --update\s+).+(\n \s.+)*$',
|
||||
r'\1Update this program to the latest version',
|
||||
),
|
||||
( # Headings
|
||||
r'(?m)^ (\w.+\n)( (?=\w))?',
|
||||
r'## \1'
|
||||
),
|
||||
( # Fixup `--date` formatting
|
||||
rf'(?m)( --date DATE.+({delim}[^\[]+)*)\[.+({delim}.+)*$',
|
||||
(rf'\1[now|today|yesterday][-N[day|week|month|year]].{delim}'
|
||||
f'E.g. "--date today-2weeks" downloads only{delim}'
|
||||
'videos uploaded on the same day two weeks ago'),
|
||||
),
|
||||
( # Do not split URLs
|
||||
rf'({delim[:-1]})? (?P<label>\[\S+\] )?(?P<url>https?({delim})?:({delim})?/({delim})?/(({delim})?\S+)+)\s',
|
||||
lambda mobj: ''.join((delim, mobj.group('label') or '', re.sub(r'\s+', '', mobj.group('url')), '\n'))
|
||||
),
|
||||
( # Do not split "words"
|
||||
rf'(?m)({delim}\S+)+$',
|
||||
lambda mobj: ''.join((delim, mobj.group(0).replace(delim, '')))
|
||||
),
|
||||
( # Allow overshooting last line
|
||||
rf'(?m)^(?P<prev>.+)${delim}(?P<current>.+)$(?!{delim})',
|
||||
lambda mobj: (mobj.group().replace(delim, ' ')
|
||||
if len(mobj.group()) - len(delim) + 1 <= max_width + ALLOWED_OVERSHOOT
|
||||
else mobj.group())
|
||||
),
|
||||
( # Avoid newline when a space is available b/w switch and description
|
||||
DISABLE_PATCH, # This creates issues with prepare_manpage
|
||||
r'(?m)^(\s{4}-.{%d})(%s)' % (switch_col_width - 6, delim),
|
||||
r'\1 '
|
||||
),
|
||||
( # Replace brackets with a Markdown link
|
||||
r'SponsorBlock API \((http.+)\)',
|
||||
r'[SponsorBlock API](\1)'
|
||||
),
|
||||
)
|
||||
|
||||
readme = read_file(README_FILE)
|
||||
|
||||
write_file(README_FILE, ''.join((
|
||||
take_section(readme, end=f'## {OPTIONS_START}'),
|
||||
functools.reduce(apply_patch, PATCHES, options),
|
||||
take_section(readme, f'# {OPTIONS_END}'),
|
||||
)))
|
||||
with io.open(README_FILE, 'w', encoding='utf-8') as f:
|
||||
f.write(header)
|
||||
f.write(options)
|
||||
f.write(footer)
|
||||
|
||||
@@ -1,19 +1,48 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

# Allow direct execution
import io
import optparse
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


from devscripts.utils import get_filename_args, write_file
from yt_dlp.extractor import list_extractor_classes
# Import yt_dlp
ROOT_DIR = os.path.join(os.path.dirname(__file__), '..')
sys.path.insert(0, ROOT_DIR)
import yt_dlp


def main():
    out = '\n'.join(ie.description() for ie in list_extractor_classes() if ie.IE_DESC is not False)
    write_file(get_filename_args(), f'# Supported sites\n{out}\n')
    parser = optparse.OptionParser(usage='%prog OUTFILE.md')
    options, args = parser.parse_args()
    if len(args) != 1:
        parser.error('Expected an output filename')

    outfile, = args

    def gen_ies_md(ies):
        for ie in ies:
            ie_md = '**{0}**'.format(ie.IE_NAME)
            ie_desc = getattr(ie, 'IE_DESC', None)
            if ie_desc is False:
                continue
            if ie_desc is not None:
                ie_md += ': {0}'.format(ie.IE_DESC)
            search_key = getattr(ie, 'SEARCH_KEY', None)
            if search_key is not None:
                ie_md += f'; "{ie.SEARCH_KEY}:" prefix'
            if not ie.working():
                ie_md += ' (Currently broken)'
            yield ie_md

    ies = sorted(yt_dlp.gen_extractors(), key=lambda i: i.IE_NAME.lower())
    out = '# Supported sites\n' + ''.join(
        ' - ' + md + '\n'
        for md in gen_ies_md(ies))

    with io.open(outfile, 'w', encoding='utf-8') as outf:
        outf.write(out)


if __name__ == '__main__':
6
devscripts/posix-locale.sh
Executable file
@@ -0,0 +1,6 @@

# source this file in your shell to get a POSIX locale (which will break many programs, but that's kind of the point)

export LC_ALL=POSIX
export LANG=POSIX
export LANGUAGE=POSIX
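A typical way to use this helper (assuming it is sourced from the repository root) would be:

    . devscripts/posix-locale.sh
    locale    # LC_ALL/LANG/LANGUAGE should now report POSIX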
@@ -1,22 +1,11 @@
|
||||
#!/usr/bin/env python3
|
||||
from __future__ import unicode_literals
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
|
||||
import io
|
||||
import optparse
|
||||
import os.path
|
||||
import re
|
||||
|
||||
from devscripts.utils import (
|
||||
compose_functions,
|
||||
get_filename_args,
|
||||
read_file,
|
||||
write_file,
|
||||
)
|
||||
|
||||
ROOT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||
README_FILE = os.path.join(ROOT_DIR, 'README.md')
|
||||
|
||||
@@ -35,6 +24,25 @@
|
||||
'''
|
||||
|
||||
|
||||
def main():
|
||||
parser = optparse.OptionParser(usage='%prog OUTFILE.md')
|
||||
options, args = parser.parse_args()
|
||||
if len(args) != 1:
|
||||
parser.error('Expected an output filename')
|
||||
|
||||
outfile, = args
|
||||
|
||||
with io.open(README_FILE, encoding='utf-8') as f:
|
||||
readme = f.read()
|
||||
|
||||
readme = filter_excluded_sections(readme)
|
||||
readme = move_sections(readme)
|
||||
readme = filter_options(readme)
|
||||
|
||||
with io.open(outfile, 'w', encoding='utf-8') as outf:
|
||||
outf.write(PREFIX + readme)
|
||||
|
||||
|
||||
def filter_excluded_sections(readme):
|
||||
EXCLUDED_SECTION_BEGIN_STRING = re.escape('<!-- MANPAGE: BEGIN EXCLUDED SECTION -->')
|
||||
EXCLUDED_SECTION_END_STRING = re.escape('<!-- MANPAGE: END EXCLUDED SECTION -->')
|
||||
@@ -67,31 +75,24 @@ def filter_options(readme):
|
||||
section = re.search(r'(?sm)^# USAGE AND OPTIONS\n.+?(?=^# )', readme).group(0)
|
||||
options = '# OPTIONS\n'
|
||||
for line in section.split('\n')[1:]:
|
||||
mobj = re.fullmatch(r'''(?x)
|
||||
\s{4}(?P<opt>-(?:,\s|[^\s])+)
|
||||
(?:\s(?P<meta>(?:[^\s]|\s(?!\s))+))?
|
||||
(\s{2,}(?P<desc>.+))?
|
||||
''', line)
|
||||
if not mobj:
|
||||
options += f'{line.lstrip()}\n'
|
||||
continue
|
||||
option, metavar, description = mobj.group('opt', 'meta', 'desc')
|
||||
if line.lstrip().startswith('-'):
|
||||
split = re.split(r'\s{2,}', line.lstrip())
|
||||
# Description string may start with `-` as well. If there is
|
||||
# only one piece then it's a description bit not an option.
|
||||
if len(split) > 1:
|
||||
option, description = split
|
||||
split_option = option.split(' ')
|
||||
|
||||
# Pandoc's definition_lists. See http://pandoc.org/README.html
|
||||
option = f'{option} *{metavar}*' if metavar else option
|
||||
description = f'{description}\n' if description else ''
|
||||
options += f'\n{option}\n: {description}'
|
||||
continue
|
||||
if not split_option[-1].startswith('-'): # metavar
|
||||
option = ' '.join(split_option[:-1] + [f'*{split_option[-1]}*'])
|
||||
|
||||
# Pandoc's definition_lists. See http://pandoc.org/README.html
|
||||
options += f'\n{option}\n: {description}\n'
|
||||
continue
|
||||
options += line.lstrip() + '\n'
|
||||
|
||||
return readme.replace(section, options, 1)
|
||||
|
||||
|
||||
TRANSFORM = compose_functions(filter_excluded_sections, move_sections, filter_options)
|
||||
|
||||
|
||||
def main():
|
||||
write_file(get_filename_args(), PREFIX + TRANSFORM(read_file(README_FILE)))
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
main()
|
||||
|
||||
143
devscripts/release.sh
Executable file
143
devscripts/release.sh
Executable file
@@ -0,0 +1,143 @@
|
||||
# Unused
|
||||
|
||||
#!/bin/bash
|
||||
|
||||
# IMPORTANT: the following assumptions are made
|
||||
# * the GH repo is on the origin remote
|
||||
# * the gh-pages branch is named so locally
|
||||
# * the git config user.signingkey is properly set
|
||||
|
||||
# You will need
|
||||
# pip install coverage nose rsa wheel
|
||||
|
||||
# TODO
|
||||
# release notes
|
||||
# make hash on local files
|
||||
|
||||
set -e
|
||||
|
||||
skip_tests=true
|
||||
gpg_sign_commits=""
|
||||
buildserver='localhost:8142'
|
||||
|
||||
while true
|
||||
do
|
||||
case "$1" in
|
||||
--run-tests)
|
||||
skip_tests=false
|
||||
shift
|
||||
;;
|
||||
--gpg-sign-commits|-S)
|
||||
gpg_sign_commits="-S"
|
||||
shift
|
||||
;;
|
||||
--buildserver)
|
||||
buildserver="$2"
|
||||
shift 2
|
||||
;;
|
||||
--*)
|
||||
echo "ERROR: unknown option $1"
|
||||
exit 1
|
||||
;;
|
||||
*)
|
||||
break
|
||||
;;
|
||||
esac
|
||||
done
|
||||
|
||||
if [ -z "$1" ]; then echo "ERROR: specify version number like this: $0 1994.09.06"; exit 1; fi
|
||||
version="$1"
|
||||
major_version=$(echo "$version" | sed -n 's#^\([0-9]*\.[0-9]*\.[0-9]*\).*#\1#p')
|
||||
if test "$major_version" '!=' "$(date '+%Y.%m.%d')"; then
|
||||
echo "$version does not start with today's date!"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if [ ! -z "`git tag | grep "$version"`" ]; then echo 'ERROR: version already present'; exit 1; fi
|
||||
if [ ! -z "`git status --porcelain | grep -v CHANGELOG`" ]; then echo 'ERROR: the working directory is not clean; commit or stash changes'; exit 1; fi
|
||||
useless_files=$(find yt_dlp -type f -not -name '*.py')
|
||||
if [ ! -z "$useless_files" ]; then echo "ERROR: Non-.py files in yt_dlp: $useless_files"; exit 1; fi
|
||||
if [ ! -f "updates_key.pem" ]; then echo 'ERROR: updates_key.pem missing'; exit 1; fi
|
||||
if ! type pandoc >/dev/null 2>/dev/null; then echo 'ERROR: pandoc is missing'; exit 1; fi
|
||||
if ! python3 -c 'import rsa' 2>/dev/null; then echo 'ERROR: python3-rsa is missing'; exit 1; fi
|
||||
if ! python3 -c 'import wheel' 2>/dev/null; then echo 'ERROR: wheel is missing'; exit 1; fi
|
||||
|
||||
read -p "Is Changelog up to date? (y/n) " -n 1
if [[ ! $REPLY =~ ^[Yy]$ ]]; then exit 1; fi

/bin/echo -e "\n### First of all, testing..."
make clean
if $skip_tests ; then
    echo 'SKIPPING TESTS'
else
    nosetests --verbose --with-coverage --cover-package=yt_dlp --cover-html test --stop || exit 1
fi

/bin/echo -e "\n### Changing version in version.py..."
sed -i "s/__version__ = '.*'/__version__ = '$version'/" yt_dlp/version.py

/bin/echo -e "\n### Changing version in Changelog..."
sed -i "s/<unreleased>/$version/" Changelog.md

/bin/echo -e "\n### Committing documentation, templates and yt_dlp/version.py..."
make README.md CONTRIBUTING.md issuetemplates supportedsites
git add README.md CONTRIBUTING.md .github/ISSUE_TEMPLATE/1_broken_site.md .github/ISSUE_TEMPLATE/2_site_support_request.md .github/ISSUE_TEMPLATE/3_site_feature_request.md .github/ISSUE_TEMPLATE/4_bug_report.md .github/ISSUE_TEMPLATE/5_feature_request.md .github/ISSUE_TEMPLATE/6_question.md docs/supportedsites.md yt_dlp/version.py Changelog.md
git commit $gpg_sign_commits -m "release $version"

/bin/echo -e "\n### Now tagging, signing and pushing..."
git tag -s -m "Release $version" "$version"
git show "$version"
read -p "Is it good, can I push? (y/n) " -n 1
if [[ ! $REPLY =~ ^[Yy]$ ]]; then exit 1; fi
echo
MASTER=$(git rev-parse --abbrev-ref HEAD)
git push origin $MASTER:master
git push origin "$version"

/bin/echo -e "\n### OK, now it is time to build the binaries..."
REV=$(git rev-parse HEAD)
make yt-dlp yt-dlp.tar.gz
read -p "VM running? (y/n) " -n 1
wget "http://$buildserver/build/ytdl-org/youtube-dl/yt-dlp.exe?rev=$REV" -O yt-dlp.exe
mkdir -p "build/$version"
mv yt-dlp yt-dlp.exe "build/$version"
mv yt-dlp.tar.gz "build/$version/yt-dlp-$version.tar.gz"
RELEASE_FILES="yt-dlp yt-dlp.exe yt-dlp-$version.tar.gz"
(cd build/$version/ && md5sum $RELEASE_FILES > MD5SUMS)
(cd build/$version/ && sha1sum $RELEASE_FILES > SHA1SUMS)
(cd build/$version/ && sha256sum $RELEASE_FILES > SHA2-256SUMS)
(cd build/$version/ && sha512sum $RELEASE_FILES > SHA2-512SUMS)

/bin/echo -e "\n### Signing and uploading the new binaries to GitHub..."
for f in $RELEASE_FILES; do gpg --passphrase-repeat 5 --detach-sig "build/$version/$f"; done

ROOT=$(pwd)
python devscripts/create-github-release.py Changelog.md $version "$ROOT/build/$version"

ssh ytdl@yt-dl.org "sh html/update_latest.sh $version"

/bin/echo -e "\n### Now switching to gh-pages..."
git clone --branch gh-pages --single-branch . build/gh-pages
(
    set -e
    ORIGIN_URL=$(git config --get remote.origin.url)
    cd build/gh-pages
    "$ROOT/devscripts/gh-pages/add-version.py" $version
    "$ROOT/devscripts/gh-pages/update-feed.py"
    "$ROOT/devscripts/gh-pages/sign-versions.py" < "$ROOT/updates_key.pem"
    "$ROOT/devscripts/gh-pages/generate-download.py"
    "$ROOT/devscripts/gh-pages/update-copyright.py"
    "$ROOT/devscripts/gh-pages/update-sites.py"
    git add *.html *.html.in update
    git commit $gpg_sign_commits -m "release $version"
    git push "$ROOT" gh-pages
    git push "$ORIGIN_URL" gh-pages
)
rm -rf build

make pypi-files
echo "Uploading to PyPi ..."
python setup.py sdist bdist_wheel upload
make clean

/bin/echo -e "\n### DONE!"

@@ -13,5 +13,4 @@ if ["%~1"]==[""] (
exit /b 1
)

set PYTHONWARNINGS=error
pytest %test_set%

@@ -1,14 +1,14 @@
#!/usr/bin/env sh
#!/bin/sh

if [ -z "$1" ]; then
if [ -z $1 ]; then
    test_set='test'
elif [ "$1" = 'core' ]; then
elif [ $1 = 'core' ]; then
    test_set="-m not download"
elif [ "$1" = 'download' ]; then
elif [ $1 = 'download' ]; then
    test_set="-m download"
else
    echo 'Invalid test type "'"$1"'". Use "core" | "download"'
    echo 'Invalid test type "'$1'". Use "core" | "download"'
    exit 1
fi

python3 -bb -Werror -m pytest "$test_set"
python3 -m pytest "$test_set"

@@ -1,36 +0,0 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import argparse
import functools
import re

from devscripts.utils import compose_functions, read_file, write_file

VERSION_FILE = 'yt_dlp/version.py'


def parse_options():
    parser = argparse.ArgumentParser(description='Set the build variant of the package')
    parser.add_argument('variant', help='Name of the variant')
    parser.add_argument('-M', '--update-message', default=None, help='Message to show in -U')
    return parser.parse_args()


def property_setter(name, value):
    return functools.partial(re.sub, rf'(?m)^{name}\s*=\s*.+$', f'{name} = {value!r}')


opts = parse_options()
transform = compose_functions(
    property_setter('VARIANT', opts.variant),
    property_setter('UPDATE_HINT', opts.update_message)
)

write_file(VERSION_FILE, transform(read_file(VERSION_FILE)))
49 devscripts/show-downloads-statistics.py Normal file
@@ -0,0 +1,49 @@
# Unused

#!/usr/bin/env python3
from __future__ import unicode_literals

import itertools
import json
import os
import re
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from yt_dlp.compat import (
    compat_print,
    compat_urllib_request,
)
from yt_dlp.utils import format_bytes


def format_size(bytes):
    return '%s (%d bytes)' % (format_bytes(bytes), bytes)


total_bytes = 0

for page in itertools.count(1):
    releases = json.loads(compat_urllib_request.urlopen(
        'https://api.github.com/repos/ytdl-org/youtube-dl/releases?page=%s' % page
    ).read().decode('utf-8'))

    if not releases:
        break

    for release in releases:
        compat_print(release['name'])
        for asset in release['assets']:
            asset_name = asset['name']
            total_bytes += asset['download_count'] * asset['size']
            if all(not re.match(p, asset_name) for p in (
                    r'^yt-dlp$',
                    r'^yt-dlp-\d{4}\.\d{2}\.\d{2}(?:\.\d+)?\.tar\.gz$',
                    r'^yt-dlp\.exe$')):
                continue
            compat_print(
                ' %s size: %s downloads: %d'
                % (asset_name, format_size(asset['size']), asset['download_count']))

compat_print('total downloads traffic: %s' % format_size(total_bytes))
@@ -1,39 +1,37 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

"""
Usage: python3 ./devscripts/update-formulae.py <path-to-formulae-rb> <version>
version can be either 0-aligned (yt-dlp version) or normalized (PyPi version)
"""

# Allow direct execution
import json
import os
import re
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from yt_dlp.compat import compat_urllib_request

import json
import re
import urllib.request

from devscripts.utils import read_file, write_file
# usage: python3 ./devscripts/update-formulae.py <path-to-formulae-rb> <version>
# version can be either 0-aligned (yt-dlp version) or normalized (PyPl version)

filename, version = sys.argv[1:]

normalized_version = '.'.join(str(int(x)) for x in version.split('.'))

pypi_release = json.loads(urllib.request.urlopen(
pypi_release = json.loads(compat_urllib_request.urlopen(
    'https://pypi.org/pypi/yt-dlp/%s/json' % normalized_version
).read().decode())
).read().decode('utf-8'))

tarball_file = next(x for x in pypi_release['urls'] if x['filename'].endswith('.tar.gz'))

sha256sum = tarball_file['digests']['sha256']
url = tarball_file['url']

formulae_text = read_file(filename)
with open(filename, 'r') as r:
    formulae_text = r.read()

formulae_text = re.sub(r'sha256 "[0-9a-f]*?"', 'sha256 "%s"' % sha256sum, formulae_text, count=1)
formulae_text = re.sub(r'url "[^"]*?"', 'url "%s"' % url, formulae_text, count=1)
formulae_text = re.sub(r'sha256 "[0-9a-f]*?"', 'sha256 "%s"' % sha256sum, formulae_text)
formulae_text = re.sub(r'url "[^"]*?"', 'url "%s"' % url, formulae_text)

write_file(filename, formulae_text)
with open(filename, 'w') as w:
    w.write(formulae_text)
@@ -1,71 +1,42 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import argparse
import contextlib
import sys
from datetime import datetime

from devscripts.utils import read_version, run_process, write_file
import sys
import subprocess


def get_new_version(version, revision):
    if not version:
        version = datetime.utcnow().strftime('%Y.%m.%d')
with open('yt_dlp/version.py', 'rt') as f:
    exec(compile(f.read(), 'yt_dlp/version.py', 'exec'))
    old_version = locals()['__version__']

    if revision:
        assert revision.isdigit(), 'Revision must be a number'
    else:
        old_version = read_version().split('.')
        if version.split('.') == old_version[:3]:
            revision = str(int((old_version + [0])[3]) + 1)
old_version_list = old_version.split('.')

    return f'{version}.{revision}' if revision else version
old_ver = '.'.join(old_version_list[:3])
old_rev = old_version_list[3] if len(old_version_list) > 3 else ''

ver = datetime.utcnow().strftime("%Y.%m.%d")

def get_git_head():
    with contextlib.suppress(Exception):
        return run_process('git', 'rev-parse', 'HEAD').stdout.strip()
rev = (sys.argv[1:] or [''])[0]  # Use first argument, if present as revision number
if not rev:
    rev = str(int(old_rev or 0) + 1) if old_ver == ver else ''

VERSION = '.'.join((ver, rev)) if rev else ver

VERSION_TEMPLATE = '''\
try:
    sp = subprocess.Popen(['git', 'rev-parse', '--short', 'HEAD'], stdout=subprocess.PIPE)
    GIT_HEAD = sp.communicate()[0].decode().strip() or None
except Exception:
    GIT_HEAD = None

VERSION_FILE = f'''\
# Autogenerated by devscripts/update-version.py

__version__ = {version!r}
__version__ = {VERSION!r}

RELEASE_GIT_HEAD = {git_head!r}

VARIANT = None

UPDATE_HINT = None

CHANNEL = {channel!r}
RELEASE_GIT_HEAD = {GIT_HEAD!r}
'''

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Update the version.py file')
    parser.add_argument(
        '-c', '--channel', choices=['stable', 'nightly'], default='stable',
        help='Select update channel (default: %(default)s)')
    parser.add_argument(
        '-o', '--output', default='yt_dlp/version.py',
        help='The output file to write to (default: %(default)s)')
    parser.add_argument(
        'version', nargs='?', default=None,
        help='A version or revision to use instead of generating one')
    args = parser.parse_args()
with open('yt_dlp/version.py', 'wt') as f:
    f.write(VERSION_FILE)

    git_head = get_git_head()
    version = (
        args.version if args.version and '.' in args.version
        else get_new_version(None, args.version))
    write_file(args.output, VERSION_TEMPLATE.format(
        version=version, git_head=git_head, channel=args.channel))

    print(f'version={version} ({args.channel}), head={git_head}')
print('::set-output name=ytdlp_version::' + VERSION)
print(f'\nVersion = {VERSION}, Git HEAD = {GIT_HEAD}')
@@ -1,46 +0,0 @@
import argparse
import functools
import subprocess


def read_file(fname):
    with open(fname, encoding='utf-8') as f:
        return f.read()


def write_file(fname, content, mode='w'):
    with open(fname, mode, encoding='utf-8') as f:
        return f.write(content)


def read_version(fname='yt_dlp/version.py'):
    """Get the version without importing the package"""
    exec(compile(read_file(fname), fname, 'exec'))
    return locals()['__version__']


def get_filename_args(has_infile=False, default_outfile=None):
    parser = argparse.ArgumentParser()
    if has_infile:
        parser.add_argument('infile', help='Input file')
    kwargs = {'nargs': '?', 'default': default_outfile} if default_outfile else {}
    parser.add_argument('outfile', **kwargs, help='Output file')

    opts = parser.parse_args()
    if has_infile:
        return opts.infile, opts.outfile
    return opts.outfile


def compose_functions(*functions):
    return lambda x: functools.reduce(lambda y, f: f(y), functions, x)


def run_process(*args, **kwargs):
    kwargs.setdefault('text', True)
    kwargs.setdefault('check', True)
    kwargs.setdefault('capture_output', True)
    if kwargs['text']:
        kwargs.setdefault('encoding', 'utf-8')
        kwargs.setdefault('errors', 'replace')
    return subprocess.run(args, **kwargs)
58 devscripts/wine-py2exe.sh Executable file
@@ -0,0 +1,58 @@
# UNUSED

#!/bin/bash

# Run with as parameter a setup.py that works in the current directory
# e.g. no os.chdir()
# It will run twice, the first time will crash

set -e

SCRIPT_DIR="$( cd "$( dirname "$0" )" && pwd )"

if [ ! -d wine-py2exe ]; then

    sudo apt-get install wine1.3 axel bsdiff

    mkdir wine-py2exe
    cd wine-py2exe
    export WINEPREFIX=`pwd`

    axel -a "http://www.python.org/ftp/python/2.7/python-2.7.msi"
    axel -a "http://downloads.sourceforge.net/project/py2exe/py2exe/0.6.9/py2exe-0.6.9.win32-py2.7.exe"
    #axel -a "http://winetricks.org/winetricks"

    # http://appdb.winehq.org/objectManager.php?sClass=version&iId=21957
    echo "Follow python setup on screen"
    wine msiexec /i python-2.7.msi

    echo "Follow py2exe setup on screen"
    wine py2exe-0.6.9.win32-py2.7.exe

    #echo "Follow Microsoft Visual C++ 2008 Redistributable Package setup on screen"
    #bash winetricks vcrun2008

    rm py2exe-0.6.9.win32-py2.7.exe
    rm python-2.7.msi
    #rm winetricks

    # http://bugs.winehq.org/show_bug.cgi?id=3591

    mv drive_c/Python27/Lib/site-packages/py2exe/run.exe drive_c/Python27/Lib/site-packages/py2exe/run.exe.backup
    bspatch drive_c/Python27/Lib/site-packages/py2exe/run.exe.backup drive_c/Python27/Lib/site-packages/py2exe/run.exe "$SCRIPT_DIR/SizeOfImage.patch"
    mv drive_c/Python27/Lib/site-packages/py2exe/run_w.exe drive_c/Python27/Lib/site-packages/py2exe/run_w.exe.backup
    bspatch drive_c/Python27/Lib/site-packages/py2exe/run_w.exe.backup drive_c/Python27/Lib/site-packages/py2exe/run_w.exe "$SCRIPT_DIR/SizeOfImage_w.patch"

    cd -

else

    export WINEPREFIX="$( cd wine-py2exe && pwd )"

fi

wine "C:\\Python27\\python.exe" "$1" py2exe > "py2exe.log" 2>&1 || true
echo '# Copying python27.dll' >> "py2exe.log"
cp "$WINEPREFIX/drive_c/windows/system32/python27.dll" build/bdist.win32/winexe/bundle-2.7/
wine "C:\\Python27\\python.exe" "$1" py2exe >> "py2exe.log" 2>&1
@@ -1,12 +1,11 @@
#!/usr/bin/env python3
from __future__ import unicode_literals

# Allow direct execution
import os
from os.path import dirname as dirn
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
import yt_dlp

ZSH_COMPLETION_FILE = "completions/zsh/_yt-dlp"
@@ -46,5 +45,5 @@ def build_completion(opt_parser):
    f.write(template)


parser = yt_dlp.parseOpts(ignore_config_files=True)[0]
parser = yt_dlp.parseOpts()[0]
build_completion(parser)
1 docs/.gitignore vendored Normal file
@@ -0,0 +1 @@
_build/
5 docs/Changelog.md Normal file
@@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../Changelog.md
```
5 docs/Collaborators.md Normal file
@@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../Collaborators.md
```
5 docs/Contributing.md Normal file
@@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../Contributing.md
```
6 docs/LICENSE.md Normal file
@@ -0,0 +1,6 @@
---
orphan: true
---
# LICENSE
```{include} ../LICENSE
```
177 docs/Makefile Normal file
@@ -0,0 +1,177 @@
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif

# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo " html to make standalone HTML files"
	@echo " dirhtml to make HTML files named index.html in directories"
	@echo " singlehtml to make a single large HTML file"
	@echo " pickle to make pickle files"
	@echo " json to make JSON files"
	@echo " htmlhelp to make HTML files and a HTML help project"
	@echo " qthelp to make HTML files and a qthelp project"
	@echo " devhelp to make HTML files and a Devhelp project"
	@echo " epub to make an epub"
	@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo " latexpdf to make LaTeX files and run them through pdflatex"
	@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
	@echo " text to make text files"
	@echo " man to make manual pages"
	@echo " texinfo to make Texinfo files"
	@echo " info to make Texinfo files and run them through makeinfo"
	@echo " gettext to make PO message catalogs"
	@echo " changes to make an overview of all changed/added/deprecated items"
	@echo " xml to make Docutils-native XML files"
	@echo " pseudoxml to make pseudoxml-XML files for display purposes"
	@echo " linkcheck to check all external links for integrity"
	@echo " doctest to run all doctests embedded in the documentation (if enabled)"

clean:
	rm -rf $(BUILDDIR)/*

html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json:
	$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@echo
	@echo "Build finished; now you can process the JSON files."

htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	".hhp project file in $(BUILDDIR)/htmlhelp."

qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/yt-dlp.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/yt-dlp.qhc"

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/yt-dlp"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/yt-dlp"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	"(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	"(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	"or in $(BUILDDIR)/linkcheck/output.txt."

doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the " \
	"results in $(BUILDDIR)/doctest/output.txt."

xml:
	$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
	@echo
	@echo "Build finished. The XML files are in $(BUILDDIR)/xml."

pseudoxml:
	$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
	@echo
	@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."
2 docs/README.md Normal file
@@ -0,0 +1,2 @@
```{include} ../README.md
```
68 docs/conf.py Normal file
@@ -0,0 +1,68 @@
# coding: utf-8
#
# yt-dlp documentation build configuration file

import sys
import os

# Allows to import yt-dlp
sys.path.insert(0, os.path.abspath('..'))

# -- General configuration ------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'myst_parser',
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The master toctree document.
master_doc = 'README'

# General information about the project.
project = u'yt-dlp'
author = u'yt-dlp'
copyright = u'UNLICENSE'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
from yt_dlp.version import __version__
version = __version__
# The full version, including alpha/beta/rc tags.
release = version

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build']

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'

# Disable highlights
highlight_language = 'none'

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']

# Enable heading anchors
myst_heading_anchors = 4

# Suppress heading warnings
suppress_warnings = [
    'myst.header',
]
1 docs/requirements.txt Normal file
@@ -0,0 +1 @@
myst-parser
5 docs/supportedsites.md Normal file
@@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../supportedsites.md
```
6 docs/ytdlp_plugins.md Normal file
@@ -0,0 +1,6 @@
---
orphan: true
---
# ytdlp_plugins

See [https://github.com/yt-dlp/yt-dlp/tree/master/ytdlp_plugins](https://github.com/yt-dlp/yt-dlp/tree/master/ytdlp_plugins).
29 public.key
@@ -1,29 +0,0 @@
-----BEGIN PGP PUBLIC KEY BLOCK-----
mQINBGP78C4BEAD0rF9zjGPAt0thlt5C1ebzccAVX7Nb1v+eqQjk+WEZdTETVCg3
WAM5ngArlHdm/fZqzUgO+pAYrB60GKeg7ffUDf+S0XFKEZdeRLYeAaqqKhSibVal
DjvOBOztu3W607HLETQAqA7wTPuIt2WqmpL60NIcyr27LxqmgdN3mNvZ2iLO+bP0
nKR/C+PgE9H4ytywDa12zMx6PmZCnVOOOu6XZEFmdUxxdQ9fFDqd9LcBKY2LDOcS
Yo1saY0YWiZWHtzVoZu1kOzjnS5Fjq/yBHJLImDH7pNxHm7s/PnaurpmQFtDFruk
t+2lhDnpKUmGr/I/3IHqH/X+9nPoS4uiqQ5HpblB8BK+4WfpaiEg75LnvuOPfZIP
KYyXa/0A7QojMwgOrD88ozT+VCkKkkJ+ijXZ7gHNjmcBaUdKK7fDIEOYI63Lyc6Q
WkGQTigFffSUXWHDCO9aXNhP3ejqFWgGMtCUsrbkcJkWuWY7q5ARy/05HbSM3K4D
U9eqtnxmiV1WQ8nXuI9JgJQRvh5PTkny5LtxqzcmqvWO9TjHBbrs14BPEO9fcXxK
L/CFBbzXDSvvAgArdqqlMoncQ/yicTlfL6qzJ8EKFiqW14QMTdAn6SuuZTodXCTi
InwoT7WjjuFPKKdvfH1GP4bnqdzTnzLxCSDIEtfyfPsIX+9GI7Jkk/zZjQARAQAB
tDdTaW1vbiBTYXdpY2tpICh5dC1kbHAgc2lnbmluZyBrZXkpIDxjb250YWN0QGdy
dWI0ay54eXo+iQJOBBMBCgA4FiEErAy75oSNaoc0ZK9OV89lkztadYEFAmP78C4C
GwMFCwkIBwIGFQoJCAsCBBYCAwECHgECF4AACgkQV89lkztadYEVqQ//cW7TxhXg
7Xbh2EZQzXml0egn6j8QaV9KzGragMiShrlvTO2zXfLXqyizrFP4AspgjSn/4NrI
8mluom+Yi+qr7DXT4BjQqIM9y3AjwZPdywe912Lxcw52NNoPZCm24I9T7ySc8lmR
FQvZC0w4H/VTNj/2lgJ1dwMflpwvNRiWa5YzcFGlCUeDIPskLx9++AJE+xwU3LYm
jQQsPBqpHHiTBEJzMLl+rfd9Fg4N+QNzpFkTDW3EPerLuvJniSBBwZthqxeAtw4M
UiAXh6JvCc2hJkKCoygRfM281MeolvmsGNyQm+axlB0vyldiPP6BnaRgZlx+l6MU
cPqgHblb7RW5j9lfr6OYL7SceBIHNv0CFrt1OnkGo/tVMwcs8LH3Ae4a7UJlIceL
V54aRxSsZU7w4iX+PB79BWkEsQzwKrUuJVOeL4UDwWajp75OFaUqbS/slDDVXvK5
OIeuth3mA/adjdvgjPxhRQjA3l69rRWIJDrqBSHldmRsnX6cvXTDy8wSXZgy51lP
m4IVLHnCy9m4SaGGoAsfTZS0cC9FgjUIyTyrq9M67wOMpUxnuB0aRZgJE1DsI23E
qdvcSNVlO+39xM/KPWUEh6b83wMn88QeW+DCVGWACQq5N3YdPnAJa50617fGbY6I
gXIoRHXkDqe23PZ/jURYCv0sjVtjPoVC+bg=
=bJkn
-----END PGP PUBLIC KEY BLOCK-----
113 pyinst.py
@@ -1,31 +1,34 @@
#!/usr/bin/env python3

# Allow direct execution
# coding: utf-8
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

import platform
import sys
from PyInstaller.utils.hooks import collect_submodules

from PyInstaller.__main__ import run as run_pyinstaller

from devscripts.utils import read_version
OS_NAME = platform.system()
if OS_NAME == 'Windows':
    from PyInstaller.utils.win32.versioninfo import (
        VarStruct, VarFileInfo, StringStruct, StringTable,
        StringFileInfo, FixedFileInfo, VSVersionInfo, SetVersion,
    )
elif OS_NAME == 'Darwin':
    pass
else:
    raise Exception('{OS_NAME} is not supported')

OS_NAME, MACHINE, ARCH = sys.platform, platform.machine().lower(), platform.architecture()[0][:2]
if MACHINE in ('x86', 'x86_64', 'amd64', 'i386', 'i686'):
    MACHINE = 'x86' if ARCH == '32' else ''
ARCH = platform.architecture()[0][:2]


def main():
    opts, version = parse_options(), read_version()
    opts = parse_options()
    version = read_version()

    onedir = '--onedir' in opts or '-D' in opts
    if not onedir and '-F' not in opts and '--onefile' not in opts:
        opts.append('--onefile')
    suffix = '_macos' if OS_NAME == 'Darwin' else '_x86' if ARCH == '32' else ''
    final_file = 'dist/%syt-dlp%s%s' % (
        'yt-dlp/' if '--onedir' in opts else '', suffix, '.exe' if OS_NAME == 'Windows' else '')

    name, final_file = exe(onedir)
    print(f'Building yt-dlp v{version} for {OS_NAME} {platform.machine()} with options {opts}')
    print(f'Building yt-dlp v{version} {ARCH}bit for {OS_NAME} with options {opts}')
    print('Remember to update the version using "devscripts/update-version.py"')
    if not os.path.isfile('yt_dlp/extractor/lazy_extractors.py'):
        print('WARNING: Building without lazy_extractors. Run '
@@ -33,43 +36,36 @@ def main():
    print(f'Destination: {final_file}\n')

    opts = [
        f'--name={name}',
        f'--name=yt-dlp{suffix}',
        '--icon=devscripts/logo.ico',
        '--upx-exclude=vcruntime140.dll',
        '--noconfirm',
        '--additional-hooks-dir=yt_dlp/__pyinstaller',
        *dependency_options(),
        *opts,
        'yt_dlp/__main__.py',
    ]

    print(f'Running PyInstaller with {opts}')
    run_pyinstaller(opts)

    import PyInstaller.__main__

    PyInstaller.__main__.run(opts)

    set_version_info(final_file, version)


def parse_options():
    # Compatibility with older arguments
    # Compatability with older arguments
    opts = sys.argv[1:]
    if opts[0:1] in (['32'], ['64']):
        if ARCH != opts[0]:
            raise Exception(f'{opts[0]}bit executable cannot be built on a {ARCH}bit system')
        opts = opts[1:]
    return opts
    return opts or ['--onefile']


def exe(onedir):
    """@returns (name, path)"""
    name = '_'.join(filter(None, (
        'yt-dlp',
        {'win32': '', 'darwin': 'macos'}.get(OS_NAME, OS_NAME),
        MACHINE,
    )))
    return name, ''.join(filter(None, (
        'dist/',
        onedir and f'{name}/',
        name,
        OS_NAME == 'win32' and '.exe'
    )))
def read_version():
    exec(compile(open('yt_dlp/version.py').read(), 'yt_dlp/version.py', 'exec'))
    return locals()['__version__']


def version_to_list(version):
@@ -77,29 +73,36 @@ def version_to_list(version):
    return list(map(int, version_list)) + [0] * (4 - len(version_list))


def dependency_options():
    dependencies = [pycryptodome_module(), 'mutagen'] + collect_submodules('websockets')
    excluded_modules = ['test', 'ytdlp_plugins', 'youtube-dl', 'youtube-dlc']

    yield from (f'--hidden-import={module}' for module in dependencies)
    yield from (f'--exclude-module={module}' for module in excluded_modules)


def pycryptodome_module():
    try:
        import Cryptodome  # noqa: F401
    except ImportError:
        try:
            import Crypto  # noqa: F401
            print('WARNING: Using Crypto since Cryptodome is not available. '
                  'Install with: pip install pycryptodomex', file=sys.stderr)
            return 'Crypto'
        except ImportError:
            pass
    return 'Cryptodome'


def set_version_info(exe, version):
    if OS_NAME == 'win32':
    if OS_NAME == 'Windows':
        windows_set_version(exe, version)


def windows_set_version(exe, version):
    from PyInstaller.utils.win32.versioninfo import (
        FixedFileInfo,
        StringFileInfo,
        StringStruct,
        StringTable,
        VarFileInfo,
        VarStruct,
        VSVersionInfo,
    )

    try:
        from PyInstaller.utils.win32.versioninfo import SetVersion
    except ImportError:  # Pyinstaller >= 5.8
        from PyInstaller.utils.win32.versioninfo import write_version_info_to_executable as SetVersion

    version_list = version_to_list(version)
    suffix = MACHINE and f'_{MACHINE}'
    suffix = '_x86' if ARCH == '32' else ''
    SetVersion(exe, VSVersionInfo(
        ffi=FixedFileInfo(
            filevers=version_list,
@@ -113,9 +116,9 @@ def windows_set_version(exe, version):
        ),
        kids=[
            StringFileInfo([StringTable('040904B0', [
                StringStruct('Comments', 'yt-dlp%s Command Line Interface' % suffix),
                StringStruct('Comments', 'yt-dlp%s Command Line Interface.' % suffix),
                StringStruct('CompanyName', 'https://github.com/yt-dlp'),
                StringStruct('FileDescription', 'yt-dlp%s' % (MACHINE and f' ({MACHINE})')),
                StringStruct('FileDescription', 'yt-dlp%s' % (' (32 Bit)' if ARCH == '32' else '')),
                StringStruct('FileVersion', version),
                StringStruct('InternalName', f'yt-dlp{suffix}'),
                StringStruct('LegalCopyright', 'pukkandan.ytdlp@gmail.com | UNLICENSE'),

@@ -1,5 +0,0 @@
[build-system]
build-backend = 'setuptools.build_meta'
# https://github.com/yt-dlp/yt-dlp/issues/5941
# https://github.com/pypa/distutils/issues/17
requires = ['setuptools > 50']
4 pytest.ini Normal file
@@ -0,0 +1,4 @@
[pytest]
addopts = -ra -v --strict-markers
markers =
    download

@@ -1,6 +1,3 @@
mutagen
pycryptodomex
websockets
brotli; platform_python_implementation=='CPython'
brotlicffi; platform_python_implementation!='CPython'
certifi

49 setup.cfg
@@ -1,49 +1,6 @@
[wheel]
universal = true

universal = True

[flake8]
exclude = build,venv,.tox,.git,.pytest_cache
ignore = E402,E501,E731,E741,W503
max_line_length = 120
per_file_ignores =
    devscripts/lazy_load_template.py: F401


[autoflake]
ignore-init-module-imports = true
ignore-pass-after-docstring = true
remove-all-unused-imports = true
remove-duplicate-keys = true
remove-unused-variables = true


[tool:pytest]
addopts = -ra -v --strict-markers
markers =
    download


[tox:tox]
skipsdist = true
envlist = py{36,37,38,39,310,311},pypy{36,37,38,39}
skip_missing_interpreters = true

[testenv] # tox
deps =
    pytest
commands = pytest {posargs:"-m not download"}
passenv = HOME # For test_compat_expanduser
setenv =
    # PYTHONWARNINGS = error # Catches PIP's warnings too


[isort]
py_version = 37
multi_line_output = VERTICAL_HANGING_INDENT
line_length = 80
reverse_relative = true
ensure_newline_before_comments = true
include_trailing_comma = true
known_first_party =
    test
exclude = yt_dlp/extractor/__init__.py,devscripts/buildserver.py,devscripts/lazy_load_template.py,devscripts/make_issue_template.py,setup.py,build,.git,venv,devscripts/create-github-release.py,devscripts/release.sh,devscripts/show-downloads-statistics.py
ignore = E402,E501,E731,E741,W503
183 setup.py
@@ -1,77 +1,60 @@
#!/usr/bin/env python3

# Allow execution from anywhere
import os
# coding: utf-8
import os.path
import warnings
import sys

sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

import subprocess
import warnings

try:
    from setuptools import Command, find_packages, setup
    from setuptools import setup, Command, find_packages
    setuptools_available = True
except ImportError:
    from distutils.core import Command, setup
    from distutils.core import setup, Command
    setuptools_available = False
    from distutils.spawn import spawn

from devscripts.utils import read_file, read_version
# Get the version from yt_dlp/version.py without importing the package
exec(compile(open('yt_dlp/version.py').read(), 'yt_dlp/version.py', 'exec'))

VERSION = read_version()

DESCRIPTION = 'A youtube-dl fork with additional features and patches'

LONG_DESCRIPTION = '\n\n'.join((
    'Official repository: <https://github.com/yt-dlp/yt-dlp>',
    '**PS**: Some links in this document will not work since this is a copy of the README.md from Github',
    read_file('README.md')))
    open('README.md', 'r', encoding='utf-8').read()))

REQUIREMENTS = read_file('requirements.txt').splitlines()
REQUIREMENTS = ['mutagen', 'pycryptodomex', 'websockets']


def packages():
    if setuptools_available:
        return find_packages(exclude=('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts'))

    return [
        'yt_dlp', 'yt_dlp.extractor', 'yt_dlp.downloader', 'yt_dlp.postprocessor', 'yt_dlp.compat',
    ]


def py2exe_params():
    if sys.argv[1:2] == ['py2exe']:
        import py2exe
    warnings.warn(
        'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
        'It is recommended to run "pyinst.py" to build using pyinstaller instead')

    return {
        'The recommended way is to use "pyinst.py" to build using pyinstaller')
    params = {
        'console': [{
            'script': './yt_dlp/__main__.py',
            'dest_base': 'yt-dlp',
            'icon_resources': [(1, 'devscripts/logo.ico')],
        }],
        'version_info': {
            'version': VERSION,
            'version': __version__,
            'description': DESCRIPTION,
            'comments': LONG_DESCRIPTION.split('\n')[0],
            'product_name': 'yt-dlp',
            'product_version': VERSION,
        },
            'product_version': __version__,
        }],
        'options': {
            'bundle_files': 0,
            'compressed': 1,
            'optimize': 2,
            'dist_dir': './dist',
            'excludes': ['Crypto', 'Cryptodome'],  # py2exe cannot import Crypto
            'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
            # Modules that are only imported dynamically must be added here
            'includes': ['yt_dlp.compat._legacy'],
            'py2exe': {
                'bundle_files': 0,
                'compressed': 1,
                'optimize': 2,
                'dist_dir': './dist',
                'excludes': ['Crypto', 'Cryptodome'],  # py2exe cannot import Crypto
                'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
            }
        },
        'zipfile': None,
        'zipfile': None
    }


def build_params():
    else:
    files_spec = [
        ('share/bash-completion/completions', ['completions/bash/yt-dlp']),
        ('share/zsh/site-functions', ['completions/zsh/_yt-dlp']),
@@ -79,26 +62,25 @@ def build_params():
        ('share/doc/yt_dlp', ['README.txt']),
        ('share/man/man1', ['yt-dlp.1'])
    ]
    root = os.path.dirname(os.path.abspath(__file__))
    data_files = []
    for dirname, files in files_spec:
        resfiles = []
        for fn in files:
            if not os.path.exists(fn):
                warnings.warn(f'Skipping file {fn} since it is not present. Try running " make pypi-files " first')
                warnings.warn('Skipping file %s since it is not present. Try running `make pypi-files` first' % fn)
            else:
                resfiles.append(fn)
        data_files.append((dirname, resfiles))

    params = {'data_files': data_files}
    params = {
        'data_files': data_files,
    }

    if setuptools_available:
        params['entry_points'] = {
            'console_scripts': ['yt-dlp = yt_dlp:main'],
            'pyinstaller40': ['hook-dirs = yt_dlp.__pyinstaller:get_hook_dirs'],
        }
        params['entry_points'] = {'console_scripts': ['yt-dlp = yt_dlp:main']}
    else:
        params['scripts'] = ['yt-dlp']
    return params


class build_lazy_extractors(Command):
@@ -112,64 +94,49 @@ def finalize_options(self):
        pass

    def run(self):
        if self.dry_run:
            print('Skipping build of lazy extractors in dry run mode')
            return
        subprocess.run([sys.executable, 'devscripts/make_lazy_extractors.py'])
        spawn([sys.executable, 'devscripts/make_lazy_extractors.py', 'yt_dlp/extractor/lazy_extractors.py'],
              dry_run=self.dry_run)


def main():
    if sys.argv[1:2] == ['py2exe']:
        params = py2exe_params()
        try:
            from py2exe import freeze
        except ImportError:
            import py2exe  # noqa: F401
            warnings.warn('You are using an outdated version of py2exe. Support for this version will be removed in the future')
            params['console'][0].update(params.pop('version_info'))
            params['options'] = {'py2exe': params.pop('options')}
        else:
            return freeze(**params)
    else:
        params = build_params()

    setup(
        name='yt-dlp',
        version=VERSION,
        maintainer='pukkandan',
        maintainer_email='pukkandan.ytdlp@gmail.com',
        description=DESCRIPTION,
        long_description=LONG_DESCRIPTION,
        long_description_content_type='text/markdown',
        url='https://github.com/yt-dlp/yt-dlp',
        packages=packages(),
        install_requires=REQUIREMENTS,
        python_requires='>=3.7',
        project_urls={
            'Documentation': 'https://github.com/yt-dlp/yt-dlp#readme',
            'Source': 'https://github.com/yt-dlp/yt-dlp',
            'Tracker': 'https://github.com/yt-dlp/yt-dlp/issues',
            'Funding': 'https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators',
        },
        classifiers=[
            'Topic :: Multimedia :: Video',
            'Development Status :: 5 - Production/Stable',
            'Environment :: Console',
            'Programming Language :: Python',
            'Programming Language :: Python :: 3.7',
            'Programming Language :: Python :: 3.8',
            'Programming Language :: Python :: 3.9',
            'Programming Language :: Python :: 3.10',
            'Programming Language :: Python :: 3.11',
            'Programming Language :: Python :: Implementation',
            'Programming Language :: Python :: Implementation :: CPython',
            'Programming Language :: Python :: Implementation :: PyPy',
            'License :: Public Domain',
            'Operating System :: OS Independent',
        ],
        cmdclass={'build_lazy_extractors': build_lazy_extractors},
        **params
    )
if setuptools_available:
    packages = find_packages(exclude=('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins'))
else:
    packages = ['yt_dlp', 'yt_dlp.downloader', 'yt_dlp.extractor', 'yt_dlp.postprocessor']


main()
setup(
    name='yt-dlp',
    version=__version__,
    maintainer='pukkandan',
    maintainer_email='pukkandan.ytdlp@gmail.com',
    description=DESCRIPTION,
    long_description=LONG_DESCRIPTION,
    long_description_content_type='text/markdown',
    url='https://github.com/yt-dlp/yt-dlp',
    packages=packages,
    install_requires=REQUIREMENTS,
    project_urls={
        'Documentation': 'https://yt-dlp.readthedocs.io',
        'Source': 'https://github.com/yt-dlp/yt-dlp',
        'Tracker': 'https://github.com/yt-dlp/yt-dlp/issues',
        'Funding': 'https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators',
    },
    classifiers=[
        'Topic :: Multimedia :: Video',
        'Development Status :: 5 - Production/Stable',
        'Environment :: Console',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: Implementation',
        'Programming Language :: Python :: Implementation :: CPython',
        'Programming Language :: Python :: Implementation :: PyPy',
        'License :: Public Domain',
        'Operating System :: OS Independent',
    ],
    python_requires='>=3.6',

    cmdclass={'build_lazy_extractors': build_lazy_extractors},
    **params
)
File diff suppressed because it is too large
Load Diff
140
test/helper.py
140
test/helper.py
@@ -1,16 +1,26 @@
|
||||
from __future__ import unicode_literals
|
||||
|
||||
import errno
|
||||
import io
|
||||
import hashlib
|
||||
import json
|
||||
import os.path
|
||||
import re
|
||||
import types
|
||||
import ssl
|
||||
import sys
|
||||
import types
|
||||
|
||||
import yt_dlp.extractor
|
||||
from yt_dlp import YoutubeDL
|
||||
from yt_dlp.compat import compat_os_name
|
||||
from yt_dlp.utils import preferredencoding, write_string
|
||||
from yt_dlp.compat import (
|
||||
compat_os_name,
|
||||
compat_str,
|
||||
)
|
||||
from yt_dlp.utils import (
|
||||
preferredencoding,
|
||||
write_string,
|
||||
)
|
||||
|
||||
|
||||
if 'pytest' in sys.modules:
|
||||
import pytest
|
||||
@@ -25,10 +35,10 @@ def get_params(override=None):
|
||||
'parameters.json')
|
||||
LOCAL_PARAMETERS_FILE = os.path.join(os.path.dirname(os.path.abspath(__file__)),
|
||||
'local_parameters.json')
|
||||
with open(PARAMETERS_FILE, encoding='utf-8') as pf:
|
||||
with io.open(PARAMETERS_FILE, encoding='utf-8') as pf:
|
||||
parameters = json.load(pf)
|
||||
if os.path.exists(LOCAL_PARAMETERS_FILE):
|
||||
with open(LOCAL_PARAMETERS_FILE, encoding='utf-8') as pf:
|
||||
with io.open(LOCAL_PARAMETERS_FILE, encoding='utf-8') as pf:
|
||||
parameters.update(json.load(pf))
|
||||
if override:
|
||||
parameters.update(override)
|
||||
@@ -44,7 +54,7 @@ def try_rm(filename):
|
||||
raise
|
||||
|
||||
|
||||
def report_warning(message, *args, **kwargs):
|
||||
def report_warning(message):
|
||||
'''
|
||||
Print the message to stderr, it will be prefixed with 'WARNING:'
|
||||
If stderr is a tty file the 'WARNING:' will be colored
|
||||
@@ -53,8 +63,8 @@ def report_warning(message, *args, **kwargs):
|
||||
_msg_header = '\033[0;33mWARNING:\033[0m'
|
||||
else:
|
||||
_msg_header = 'WARNING:'
|
||||
output = f'{_msg_header} {message}\n'
|
||||
if 'b' in getattr(sys.stderr, 'mode', ''):
|
||||
output = '%s %s\n' % (_msg_header, message)
|
||||
if 'b' in getattr(sys.stderr, 'mode', '') or sys.version_info[0] < 3:
|
||||
output = output.encode(preferredencoding())
|
||||
sys.stderr.write(output)
|
||||
|
||||
@@ -64,13 +74,13 @@ def __init__(self, override=None):
|
||||
# Different instances of the downloader can't share the same dictionary
|
||||
# some test set the "sublang" parameter, which would break the md5 checks.
|
||||
params = get_params(override=override)
|
||||
super().__init__(params, auto_init=False)
|
||||
super(FakeYDL, self).__init__(params, auto_init=False)
|
||||
self.result = []
|
||||
|
||||
def to_screen(self, s, *args, **kwargs):
|
||||
def to_screen(self, s, skip_eol=None):
|
||||
print(s)
|
||||
|
||||
def trouble(self, s, *args, **kwargs):
|
||||
def trouble(self, s, tb=None):
|
||||
raise Exception(s)
|
||||
|
||||
def download(self, x):
|
||||
@@ -80,59 +90,56 @@ def expect_warning(self, regex):
|
||||
# Silence an expected warning matching a regex
|
||||
old_report_warning = self.report_warning
|
||||
|
||||
def report_warning(self, message, *args, **kwargs):
|
||||
def report_warning(self, message):
|
||||
if re.match(regex, message):
|
||||
return
|
||||
old_report_warning(message, *args, **kwargs)
|
||||
old_report_warning(message)
|
||||
self.report_warning = types.MethodType(report_warning, self)
|
||||
|
||||
|
||||
def gettestcases(include_onlymatching=False):
|
||||
for ie in yt_dlp.extractor.gen_extractors():
|
||||
yield from ie.get_testcases(include_onlymatching)
|
||||
|
||||
|
||||
def getwebpagetestcases():
|
||||
for ie in yt_dlp.extractor.gen_extractors():
|
||||
for tc in ie.get_webpage_testcases():
|
||||
tc.setdefault('add_ie', []).append('Generic')
|
||||
for tc in ie.get_testcases(include_onlymatching):
|
||||
yield tc
|
||||
|
||||
|
||||
md5 = lambda s: hashlib.md5(s.encode()).hexdigest()
|
||||
md5 = lambda s: hashlib.md5(s.encode('utf-8')).hexdigest()
|
||||
|
||||
|
||||
def expect_value(self, got, expected, field):
|
||||
if isinstance(expected, str) and expected.startswith('re:'):
|
||||
if isinstance(expected, compat_str) and expected.startswith('re:'):
|
||||
match_str = expected[len('re:'):]
|
||||
match_rex = re.compile(match_str)
|
||||
|
||||
self.assertTrue(
|
||||
isinstance(got, str),
|
||||
f'Expected a {str.__name__} object, but got {type(got).__name__} for field {field}')
|
||||
isinstance(got, compat_str),
|
||||
'Expected a %s object, but got %s for field %s' % (
|
||||
compat_str.__name__, type(got).__name__, field))
|
||||
self.assertTrue(
|
||||
match_rex.match(got),
|
||||
f'field {field} (value: {got!r}) should match {match_str!r}')
|
||||
elif isinstance(expected, str) and expected.startswith('startswith:'):
|
||||
'field %s (value: %r) should match %r' % (field, got, match_str))
|
||||
elif isinstance(expected, compat_str) and expected.startswith('startswith:'):
|
||||
start_str = expected[len('startswith:'):]
|
||||
self.assertTrue(
|
||||
isinstance(got, str),
|
||||
f'Expected a {str.__name__} object, but got {type(got).__name__} for field {field}')
|
||||
isinstance(got, compat_str),
|
||||
'Expected a %s object, but got %s for field %s' % (
|
||||
compat_str.__name__, type(got).__name__, field))
|
||||
self.assertTrue(
|
||||
got.startswith(start_str),
|
||||
f'field {field} (value: {got!r}) should start with {start_str!r}')
|
||||
elif isinstance(expected, str) and expected.startswith('contains:'):
|
||||
'field %s (value: %r) should start with %r' % (field, got, start_str))
|
||||
elif isinstance(expected, compat_str) and expected.startswith('contains:'):
|
||||
contains_str = expected[len('contains:'):]
|
||||
self.assertTrue(
|
||||
isinstance(got, str),
|
||||
f'Expected a {str.__name__} object, but got {type(got).__name__} for field {field}')
|
||||
isinstance(got, compat_str),
|
||||
'Expected a %s object, but got %s for field %s' % (
|
||||
compat_str.__name__, type(got).__name__, field))
|
||||
self.assertTrue(
|
||||
contains_str in got,
|
||||
f'field {field} (value: {got!r}) should contain {contains_str!r}')
|
||||
'field %s (value: %r) should contain %r' % (field, got, contains_str))
|
||||
elif isinstance(expected, type):
|
||||
self.assertTrue(
|
||||
isinstance(got, expected),
|
||||
f'Expected type {expected!r} for field {field}, but got value {got!r} of type {type(got)!r}')
|
||||
'Expected type %r for field %s, but got value %r of type %r' % (expected, field, got, type(got)))
|
||||
elif isinstance(expected, dict) and isinstance(got, dict):
|
||||
expect_dict(self, got, expected)
|
||||
elif isinstance(expected, list) and isinstance(got, list):
|
||||
@@ -149,15 +156,16 @@ def expect_value(self, got, expected, field):
|
||||
index, field, type_expected, type_got))
|
||||
expect_value(self, item_got, item_expected, field)
|
||||
else:
|
||||
if isinstance(expected, str) and expected.startswith('md5:'):
|
||||
if isinstance(expected, compat_str) and expected.startswith('md5:'):
|
||||
self.assertTrue(
|
||||
isinstance(got, str),
|
||||
f'Expected field {field} to be a unicode object, but got value {got!r} of type {type(got)!r}')
|
||||
isinstance(got, compat_str),
|
||||
'Expected field %s to be a unicode object, but got value %r of type %r' % (field, got, type(got)))
|
||||
got = 'md5:' + md5(got)
|
||||
elif isinstance(expected, str) and re.match(r'^(?:min|max)?count:\d+', expected):
|
||||
elif isinstance(expected, compat_str) and re.match(r'^(?:min|max)?count:\d+', expected):
|
||||
self.assertTrue(
|
||||
isinstance(got, (list, dict)),
|
||||
f'Expected field {field} to be a list or a dict, but it is of type {type(got).__name__}')
|
||||
'Expected field %s to be a list or a dict, but it is of type %s' % (
|
||||
field, type(got).__name__))
|
||||
op, _, expected_num = expected.partition(':')
|
||||
expected_num = int(expected_num)
|
||||
if op == 'mincount':
|
||||
@@ -177,7 +185,7 @@ def expect_value(self, got, expected, field):
|
||||
return
|
||||
self.assertEqual(
|
||||
expected, got,
|
||||
f'Invalid value for field {field}, expected {expected!r}, got {got!r}')
|
||||
'Invalid value for field %s, expected %r, got %r' % (field, expected, got))
|
||||
|
||||
|
||||
def expect_dict(self, got_dict, expected_dict):
|
||||
@@ -188,7 +196,15 @@ def expect_dict(self, got_dict, expected_dict):
|
||||
|
||||
def sanitize_got_info_dict(got_dict):
|
||||
IGNORED_FIELDS = (
|
||||
*YoutubeDL._format_fields,
|
||||
# Format keys
|
||||
'url', 'manifest_url', 'format', 'format_id', 'format_note', 'width', 'height', 'resolution',
|
||||
'dynamic_range', 'tbr', 'abr', 'acodec', 'asr', 'vbr', 'fps', 'vcodec', 'container', 'filesize',
|
||||
'filesize_approx', 'player_url', 'protocol', 'fragment_base_url', 'fragments', 'preference',
|
||||
'language', 'language_preference', 'quality', 'source_preference', 'http_headers',
|
||||
'stretched_ratio', 'no_resume', 'has_drm', 'downloader_options',
|
||||
|
||||
# RTMP formats
|
||||
'page_url', 'app', 'play_path', 'tc_url', 'flash_version', 'rtmp_live', 'rtmp_conn', 'rtmp_protocol', 'rtmp_real_time',
|
||||
|
||||
# Lists
|
||||
'formats', 'thumbnails', 'subtitles', 'automatic_captions', 'comments', 'entries',
|
||||
@@ -204,7 +220,7 @@ def sanitize_got_info_dict(got_dict):
|
||||
IGNORED_PREFIXES = ('', 'playlist', 'requested', 'webpage')
|
||||
|
||||
def sanitize(key, value):
|
||||
if isinstance(value, str) and len(value) > 100 and key != 'thumbnail':
|
||||
if isinstance(value, str) and len(value) > 100:
|
||||
return f'md5:{md5(value)}'
|
||||
elif isinstance(value, list) and len(value) > 10:
|
||||
return f'count:{len(value)}'
|
||||
@@ -222,10 +238,6 @@ def sanitize(key, value):
|
||||
if test_info_dict.get('display_id') == test_info_dict.get('id'):
|
||||
test_info_dict.pop('display_id')
|
||||
|
||||
# Check url for flat entries
|
||||
if got_dict.get('_type', 'video') != 'video' and got_dict.get('url'):
|
||||
test_info_dict['url'] = got_dict['url']
|
||||
|
||||
return test_info_dict
|
||||
|
||||
|
||||
@@ -239,31 +251,33 @@ def expect_info_dict(self, got_dict, expected_dict):
    for key in mandatory_fields:
        self.assertTrue(got_dict.get(key), 'Missing mandatory field %s' % key)
    # Check for mandatory fields that are automatically set by YoutubeDL
    if got_dict.get('_type', 'video') == 'video':
        for key in ['webpage_url', 'extractor', 'extractor_key']:
            self.assertTrue(got_dict.get(key), 'Missing field: %s' % key)
    for key in ['webpage_url', 'extractor', 'extractor_key']:
        self.assertTrue(got_dict.get(key), 'Missing field: %s' % key)

    test_info_dict = sanitize_got_info_dict(got_dict)

    missing_keys = set(test_info_dict.keys()) - set(expected_dict.keys())
    if missing_keys:
        def _repr(v):
            if isinstance(v, str):
            if isinstance(v, compat_str):
                return "'%s'" % v.replace('\\', '\\\\').replace("'", "\\'").replace('\n', '\\n')
            elif isinstance(v, type):
                return v.__name__
            else:
                return repr(v)
        info_dict_str = ''.join(
            f' {_repr(k)}: {_repr(v)},\n'
            for k, v in test_info_dict.items() if k not in missing_keys)
        if info_dict_str:
            info_dict_str += '\n'
        info_dict_str = ''
        if len(missing_keys) != len(expected_dict):
            info_dict_str += ''.join(
                ' %s: %s,\n' % (_repr(k), _repr(v))
                for k, v in test_info_dict.items() if k not in missing_keys)

            if info_dict_str:
                info_dict_str += '\n'
        info_dict_str += ''.join(
            f' {_repr(k)}: {_repr(test_info_dict[k])},\n'
            ' %s: %s,\n' % (_repr(k), _repr(test_info_dict[k]))
            for k in missing_keys)
        info_dict_str = '\n\'info_dict\': {\n' + info_dict_str + '},\n'
        write_string(info_dict_str.replace('\n', '\n '), out=sys.stderr)
        write_string(
            '\n\'info_dict\': {\n' + info_dict_str + '},\n', out=sys.stderr)
        self.assertFalse(
            missing_keys,
            'Missing keys in test definition: %s' % (
@@ -289,30 +303,30 @@ def assertRegexpMatches(self, text, regexp, msg=None):
def assertGreaterEqual(self, got, expected, msg=None):
    if not (got >= expected):
        if msg is None:
            msg = f'{got!r} not greater than or equal to {expected!r}'
            msg = '%r not greater than or equal to %r' % (got, expected)
        self.assertTrue(got >= expected, msg)


def assertLessEqual(self, got, expected, msg=None):
    if not (got <= expected):
        if msg is None:
            msg = f'{got!r} not less than or equal to {expected!r}'
            msg = '%r not less than or equal to %r' % (got, expected)
        self.assertTrue(got <= expected, msg)


def assertEqual(self, got, expected, msg=None):
    if not (got == expected):
        if msg is None:
            msg = f'{got!r} not equal to {expected!r}'
            msg = '%r not equal to %r' % (got, expected)
        self.assertTrue(got == expected, msg)


def expect_warnings(ydl, warnings_re):
    real_warning = ydl.report_warning

    def _report_warning(w, *args, **kwargs):
    def _report_warning(w):
        if not any(re.search(w_re, w) for w_re in warnings_re):
            real_warning(w, *args, **kwargs)
            real_warning(w)

    ydl.report_warning = _report_warning
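The expect_warnings() change above only narrows the wrapper's signature; the underlying pattern is a monkey-patch that swallows warnings matching the expected regexes and forwards everything else. A rough, self-contained sketch of that pattern (FakeLogger is a made-up stand-in for illustration, not a yt-dlp class):

import re

class FakeLogger:
    # Hypothetical stand-in for any object exposing report_warning().
    def report_warning(self, message, *args, **kwargs):
        print(f'WARNING: {message}')

def suppress_expected_warnings(obj, warnings_re):
    # Same idea as expect_warnings() in the hunk above: wrap report_warning
    # and drop any message matching one of the expected patterns.
    real_warning = obj.report_warning

    def _report_warning(w, *args, **kwargs):
        if not any(re.search(w_re, w) for w_re in warnings_re):
            real_warning(w, *args, **kwargs)

    obj.report_warning = _report_warning

logger = FakeLogger()
suppress_expected_warnings(logger, [r'unable to download .* manifest'])
logger.report_warning('unable to download DASH manifest')  # suppressed
logger.report_warning('something else went wrong')         # printed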
@@ -44,6 +44,5 @@
    "writesubtitles": false,
    "allsubtitles": false,
    "listsubtitles": false,
    "fixup": "never",
    "allow_playlist_files": false
    "fixup": "never"
}
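The JSON hunk above drops "allow_playlist_files" from the shared test parameters file. For orientation, a hedged sketch of how such a parameters file is typically loaded and merged with per-test overrides; the function name and path handling here are illustrative, not the exact helper code:

import json
import os

def load_test_params(override=None, filename='parameters.json'):
    # Illustrative: read the shared defaults, then layer test-specific overrides on top.
    path = os.path.join(os.path.dirname(os.path.abspath(__file__)), filename)
    with open(path, encoding='utf-8') as f:
        params = json.load(f)
    if override:
        params.update(override)
    return params

params = load_test_params({'fixup': 'warn'})
print(params.get('fixup'))  # 'warn'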
1 test/swftests.unused/.gitignore vendored Normal file
@@ -0,0 +1 @@
*.swf
19 test/swftests.unused/ArrayAccess.as Normal file
@@ -0,0 +1,19 @@
// input: [["a", "b", "c", "d"]]
// output: ["c", "b", "a", "d"]

package {
public class ArrayAccess {
    public static function main(ar:Array):Array {
        var aa:ArrayAccess = new ArrayAccess();
        return aa.f(ar, 2);
    }

    private function f(ar:Array, num:Number):Array{
        var x:String = ar[0];
        var y:String = ar[num % ar.length];
        ar[0] = y;
        ar[num] = x;
        return ar;
    }
}
}
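Each of the restored ActionScript test files above and below starts with two comment lines declaring the JSON-encoded input arguments and the expected output, which a test harness can read before compiling and running the SWF. A hedged sketch of parsing that header (illustrative; this is not the actual swf test runner):

import json
import re

def parse_swftest_header(source):
    # Pull the '// input: ...' and '// output: ...' comments from a test .as file.
    input_m = re.search(r'^//\s*input:\s*(.*)$', source, re.MULTILINE)
    output_m = re.search(r'^//\s*output:\s*(.*)$', source, re.MULTILINE)
    if not input_m or not output_m:
        raise ValueError('missing input/output header comments')
    return json.loads(input_m.group(1)), json.loads(output_m.group(1))

header = '// input: [["a", "b", "c", "d"]]\n// output: ["c", "b", "a", "d"]\n'
args, expected = parse_swftest_header(header)
print(args, expected)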
17 test/swftests.unused/ClassCall.as Normal file
@@ -0,0 +1,17 @@
// input: []
// output: 121

package {
public class ClassCall {
    public static function main():int{
        var f:OtherClass = new OtherClass();
        return f.func(100,20);
    }
}
}

class OtherClass {
    public function func(x: int, y: int):int {
        return x+y+1;
    }
}
15 test/swftests.unused/ClassConstruction.as Normal file
@@ -0,0 +1,15 @@
// input: []
// output: 0

package {
public class ClassConstruction {
    public static function main():int{
        var f:Foo = new Foo();
        return 0;
    }
}
}

class Foo {

}
18 test/swftests.unused/ConstArrayAccess.as Normal file
@@ -0,0 +1,18 @@
// input: []
// output: 4

package {
public class ConstArrayAccess {
    private static const x:int = 2;
    private static const ar:Array = ["42", "3411"];

    public static function main():int{
        var c:ConstArrayAccess = new ConstArrayAccess();
        return c.f();
    }

    public function f(): int {
        return ar[1].length;
    }
}
}
12 test/swftests.unused/ConstantInt.as Normal file
@@ -0,0 +1,12 @@
// input: []
// output: 2

package {
public class ConstantInt {
    private static const x:int = 2;

    public static function main():int{
        return x;
    }
}
}
10 test/swftests.unused/DictCall.as Normal file
@@ -0,0 +1,10 @@
// input: [{"x": 1, "y": 2}]
// output: 3

package {
public class DictCall {
    public static function main(d:Object):int{
        return d.x + d.y;
    }
}
}
10 test/swftests.unused/EqualsOperator.as Normal file
@@ -0,0 +1,10 @@
// input: []
// output: false

package {
public class EqualsOperator {
    public static function main():Boolean{
        return 1 == 2;
    }
}
}
13 test/swftests.unused/LocalVars.as Normal file
@@ -0,0 +1,13 @@
// input: [1, 2]
// output: 3

package {
public class LocalVars {
    public static function main(a:int, b:int):int{
        var c:int = a + b + b;
        var d:int = c - b;
        var e:int = d;
        return e;
    }
}
}
22 test/swftests.unused/MemberAssignment.as Normal file
@@ -0,0 +1,22 @@
// input: [1]
// output: 2

package {
public class MemberAssignment {
    public var v:int;

    public function g():int {
        return this.v;
    }

    public function f(a:int):int{
        this.v = a;
        return this.v + this.g();
    }

    public static function main(a:int): int {
        var v:MemberAssignment = new MemberAssignment();
        return v.f(a);
    }
}
}
24 test/swftests.unused/NeOperator.as Normal file
@@ -0,0 +1,24 @@
// input: []
// output: 123

package {
public class NeOperator {
    public static function main(): int {
        var res:int = 0;
        if (1 != 2) {
            res += 3;
        } else {
            res += 4;
        }
        if (2 != 2) {
            res += 10;
        } else {
            res += 20;
        }
        if (9 == 9) {
            res += 100;
        }
        return res;
    }
}
}
21 test/swftests.unused/PrivateCall.as Normal file
@@ -0,0 +1,21 @@
// input: []
// output: 9

package {
public class PrivateCall {
    public static function main():int{
        var f:OtherClass = new OtherClass();
        return f.func();
    }
}
}

class OtherClass {
    private function pf():int {
        return 9;
    }

    public function func():int {
        return this.pf();
    }
}
22 test/swftests.unused/PrivateVoidCall.as Normal file
@@ -0,0 +1,22 @@
// input: []
// output: 9

package {
public class PrivateVoidCall {
    public static function main():int{
        var f:OtherClass = new OtherClass();
        f.func();
        return 9;
    }
}
}

class OtherClass {
    private function pf():void {
        ;
    }

    public function func():void {
        this.pf();
    }
}
Some files were not shown because too many files have changed in this diff