Merge pull request #515 from Guovin/dev
Release:v1.5.1
Guovin authored Nov 5, 2024
2 parents 4777370 + 9ff6819 commit 650db8f
Showing 17 changed files with 99 additions and 40 deletions.
34 changes: 34 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,39 @@
# Changelog

## v1.5.1

### 2024/11/5

- ✨ Added a channel interface whitelist: whitelisted interfaces skip speed testing and always stay at the top of the results. Usage: append $! to an interface address in the template (e.g. 广东珠江,http://xxx.m3u$!), optionally followed by extra info (e.g. 广东珠江,http://xxx.m3u$!extra info) (#470). More whitelist candidates are discussed at https://github.com/Guovin/TV/issues/514
- ✨ Added a 🈳 no-result channel category: channels with no results are grouped under this bottom category by default (#473)
- ✨ Interface addresses now carry a source-type label
- ✨ Added 广东民生 (#481) and 广州综合 (#504) to the default template
- 🪄 Improved preferred-result output
- 🪄 Refactored config loading and added global constants
- 🐛 Fixed matching failures for some interfaces
- 🐛 Fixed empty update results and related issues (#464, #467)
- 🐛 Fixed stray whitespace when copying interface addresses (#472, by @haohaitao)
- 🐛 Fixed an unpack error in the result log
- 🐛 Fixed empty interface info in results (#505)
- 🗑️ Removed the txt result files from the repository root; see the output directory for result files
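
As a quick illustration of the whitelist syntax above, a template entry might look like this (the channel name and URL are the placeholder values from the changelog entry; the text after `$!` is optional extra info):

```
广东珠江,http://xxx.m3u$!
广东珠江,http://xxx.m3u$!extra info
```

Both lines are whitelisted: the interface skips speed testing and is pinned to the top of the results.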


## v1.5.0

### 2024/10/25
10 changes: 8 additions & 2 deletions README.md
@@ -21,7 +21,7 @@
</div>
<br>
<div>
🏠广东频道: 广东珠江, 广东体育, 广东新闻, 广东卫视, 大湾区卫视, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
🏠广东频道: 广东珠江, 广东体育, 广东新闻, 广东民生, 广东卫视, 大湾区卫视, 广州综合, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
</div>
<br>
<div>
@@ -117,8 +117,14 @@ Fork 本项目并开启工作流更新,具体步骤请见[详细教程](./docs
### 方式二:命令行

```bash
pip3 install pipenv
pip install pipenv
```

```bash
pipenv install
```

```bash
pipenv run build
```

10 changes: 8 additions & 2 deletions README_en.md
@@ -21,7 +21,7 @@
</div>
<br>
<div>
🏠Guangdong Channel: 广东珠江, 广东体育, 广东新闻, 广东卫视, 大湾区卫视, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
🏠Guangdong Channel: 广东珠江, 广东体育, 广东新闻, 广东民生, 广东卫视, 大湾区卫视, 广州综合, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
</div>
<br>
<div>
@@ -117,8 +117,14 @@ Fork this project and initiate workflow updates, detailed steps are available at
### Method 2: Command Line

```bash
pip3 install pipenv
pip install pipenv
```

```bash
pipenv install
```

```bash
pipenv run build
```

2 changes: 1 addition & 1 deletion docs/tutorial.md
@@ -174,7 +174,7 @@ https://mirror.ghproxy.com/raw.githubusercontent.com/您的github用户名/仓

2. 运行更新
项目目录下打开终端 CMD 依次运行以下命令:
pip3 install pipenv
pip install pipenv
pipenv install
pipenv run build
```
2 changes: 1 addition & 1 deletion docs/tutorial_en.md
@@ -171,7 +171,7 @@ Please download and install Python from the official site. During installation,

2. Run Update
Open a CMD terminal in the project directory and run the following commands in sequence:
pip3 install pipenv
pip install pipenv
pipenv install
pipenv run build
```
Binary file added updates/fofa/fofa_hotel_region_result.pkl
16 changes: 11 additions & 5 deletions updates/fofa/request.py
@@ -6,6 +6,7 @@
from driver.setup import setup_driver
import re
from utils.config import config
import utils.constants as constants
from utils.retry import retry_func
from utils.channel import format_channel_name
from utils.tools import merge_objects, get_pbar_remaining, add_url_info, resource_path
@@ -91,9 +92,10 @@ async def get_channels_by_fofa(urls=None, multicast=False, callback=None):
test_url = fofa_urls[0][0]
proxy = await get_proxy(test_url, best=True, with_test=True)
cancel_event = threading.Event()
hotel_name = constants.origin_map["hotel"]

def process_fofa_channels(fofa_info):
nonlocal proxy, fofa_urls_len, open_driver, open_sort, cancel_event
nonlocal proxy
if cancel_event.is_set():
return {}
fofa_url = fofa_info[0]
@@ -130,7 +132,11 @@ def process_fofa_channels(fofa_info):
with ThreadPoolExecutor(max_workers=100) as executor:
futures = [
executor.submit(
process_fofa_json_url, url, fofa_info[1], open_sort
process_fofa_json_url,
url,
fofa_info[1],
open_sort,
hotel_name,
)
for url in urls
]
@@ -184,7 +190,7 @@ def process_fofa_channels(fofa_info):
return fofa_results


def process_fofa_json_url(url, region, open_sort):
def process_fofa_json_url(url, region, open_sort, hotel_name="酒店源"):
"""
Process the FOFA json url
"""
@@ -208,11 +214,11 @@ def process_fofa_json_url(url, region, open_sort):
total_url = (
add_url_info(
f"{url}{item_url}",
f"{region}酒店源|cache:{url}",
f"{region}{hotel_name}|cache:{url}",
)
if open_sort
else add_url_info(
f"{url}{item_url}", f"{region}酒店源"
f"{url}{item_url}", f"{region}{hotel_name}"
)
)
if item_name not in channels:
2 changes: 1 addition & 1 deletion updates/hotel/request.py
@@ -41,7 +41,7 @@ async def get_channels_by_hotel(callback=None):
start_time = time()

def process_region_by_hotel(region):
nonlocal proxy, open_driver, page_num
nonlocal proxy
name = f"{region}"
info_list = []
driver = None
2 changes: 1 addition & 1 deletion updates/multicast/request.py
@@ -53,7 +53,7 @@ async def get_channels_by_multicast(names, callback=None):
merge_objects(search_region_type_result, fofa_result)

def process_channel_by_multicast(region, type):
nonlocal proxy, open_driver, page_num, start_time
nonlocal proxy
name = f"{region}{type}"
info_list = []
driver = None
6 changes: 5 additions & 1 deletion updates/online_search/request.py
@@ -1,5 +1,6 @@
from asyncio import create_task, gather
from utils.config import config
import utils.constants as constants
from utils.speed import get_speed
from utils.channel import (
format_channel_name,
@@ -11,6 +12,7 @@
get_pbar_remaining,
get_soup,
format_url_with_cache,
add_url_info,
)
from updates.proxy import get_proxy, get_proxy_next
from time import time
@@ -61,9 +63,10 @@ async def get_channels_by_online_search(names, callback=None):
if open_proxy:
proxy = await get_proxy(pageUrl, best=True, with_test=True)
start_time = time()
online_search_name = constants.origin_map["online_search"]

def process_channel_by_online_search(name):
nonlocal proxy, open_proxy, open_driver, page_num
nonlocal proxy
info_list = []
driver = None
try:
@@ -166,6 +169,7 @@ def process_channel_by_online_search(name):
for result in results:
url, date, resolution = result
if url and check_url_by_patterns(url):
url = add_url_info(url, online_search_name)
url = format_url_with_cache(url)
info_list.append((url, date, resolution))
break
1 change: 0 additions & 1 deletion updates/proxy/request.py
@@ -28,7 +28,6 @@ def get_proxy_list(page_count=1):
pbar = tqdm(total=len(urls), desc="Getting proxy list")

def get_proxy(url):
nonlocal open_driver
proxys = []
try:
if open_driver:
12 changes: 10 additions & 2 deletions updates/subscribe/request.py
@@ -3,6 +3,7 @@
from time import time
from requests import Session, exceptions
from utils.config import config
import utils.constants as constants
from utils.retry import retry_func
from utils.channel import get_name_url, format_channel_name
from utils.tools import (
@@ -40,6 +41,9 @@ async def get_channels_by_subscribe_urls(
0,
)
session = Session()
hotel_name = constants.origin_map["hotel"]
multicast_name = constants.origin_map["multicast"]
subscribe_name = constants.origin_map["subscribe"]

def process_subscribe_channels(subscribe_info):
if (multicast or hotel) and isinstance(subscribe_info, dict):
@@ -83,9 +87,13 @@ def process_subscribe_channels(subscribe_info):
url = url.partition("$")[0]
if not multicast:
info = (
f"{region}酒店源"
f"{region}{hotel_name}"
if hotel
else "组播源" if "/rtp/" in url else "订阅源"
else (
f"{multicast_name}"
if "/rtp/" in url
else f"{subscribe_name}"
)
)
url = add_url_info(url, info)
url = format_url_with_cache(
23 changes: 4 additions & 19 deletions utils/channel.py
@@ -7,7 +7,6 @@
add_url_info,
remove_cache_info,
resource_path,
get_resolution_value,
)
from utils.speed import (
sort_urls_by_speed_and_resolution,
@@ -253,17 +252,19 @@ def get_channel_multicast_result(result, search_result):
Get the channel multicast info result by result and search result
"""
info_result = {}
multicast_name = constants.origin_map["multicast"]
for name, result_obj in result.items():
info_list = [
(
(
add_url_info(
f"http://{url}/rtp/{ip}",
f"{result_region}{result_type}组播源|cache:{url}",
f"{result_region}{result_type}{multicast_name}|cache:{url}",
)
if config.open_sort
else add_url_info(
f"http://{url}/rtp/{ip}", f"{result_region}{result_type}组播源"
f"http://{url}/rtp/{ip}",
f"{result_region}{result_type}{multicast_name}",
)
),
date,
@@ -614,8 +615,6 @@ async def sort_channel_list(
semaphore,
ffmpeg=False,
ipv6_proxy=None,
filter_resolution=False,
min_resolution=None,
callback=None,
):
"""
@@ -630,10 +629,6 @@
)
if sorted_data:
for (url, date, resolution, origin), response_time in sorted_data:
if resolution and filter_resolution:
resolution_value = get_resolution_value(resolution)
if resolution_value < min_resolution:
continue
logging.info(
f"Name: {name}, URL: {url}, Date: {date}, Resolution: {resolution}, Response Time: {response_time} ms"
)
@@ -670,8 +665,6 @@ async def process_sort_channel_list(data, ipv6=False, callback=None):
semaphore,
ffmpeg=is_ffmpeg,
ipv6_proxy=ipv6_proxy,
filter_resolution=config.open_filter_resolution,
min_resolution=config.min_resolution_value,
callback=callback,
)
)
@@ -718,12 +711,6 @@ async def process_sort_channel_list(data, ipv6=False, callback=None):
continue
response_time, resolution = cache
if response_time and response_time != float("inf"):
if resolution:
if config.open_filter_resolution:
resolution_value = get_resolution_value(resolution)
if resolution_value < config.min_resolution_value:
continue
url = add_url_info(url, resolution)
append_data_to_info_data(
sort_data,
cate,
@@ -845,6 +832,4 @@ def format_channel_url_info(data):
for url_info in obj.values():
for i, (url, date, resolution, origin) in enumerate(url_info):
url = remove_cache_info(url)
if resolution:
url = add_url_info(url, resolution)
url_info[i] = (url, date, resolution, origin)
7 changes: 7 additions & 0 deletions utils/constants.py
@@ -51,3 +51,10 @@
"CCTV17农业农村": "CCTV17",
"CCTV17农业": "CCTV17",
}

origin_map = {
"hotel": "酒店源",
"multicast": "组播源",
"subscribe": "订阅源",
"online_search": "关键字源",
}
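
The new `origin_map` constant centralizes the source-type labels that the diffs above splice into URL info strings. A minimal self-contained sketch of the pattern, assuming a simplified `add_url_info` helper (the real one lives in `utils/tools.py`):

```python
# Mirror of the new utils/constants.py entry (values are Chinese labels)
origin_map = {
    "hotel": "酒店源",            # hotel source
    "multicast": "组播源",        # multicast source
    "subscribe": "订阅源",        # subscription source
    "online_search": "关键字源",  # keyword-search source
}


def add_url_info(url: str, info: str) -> str:
    """Simplified stand-in: append extra info after a '$' delimiter."""
    return f"{url}${info}" if info else url


# Label a subscription-source URL the way the updated subscribe/request.py does
url = add_url_info("http://example.com/stream.m3u8", origin_map["subscribe"])
# → http://example.com/stream.m3u8$订阅源
```

Centralizing the labels this way means renaming one only has to happen in `utils/constants.py` instead of in every `updates/*/request.py` module.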
2 changes: 0 additions & 2 deletions utils/speed.py
@@ -103,8 +103,6 @@ async def check_stream_speed(url_info):
frame, resolution = get_video_info(video_info)
if frame is None or frame == float("inf"):
return float("inf")
if resolution:
url_info[0] = add_url_info(url, resolution)
url_info[2] = resolution
return (url_info, frame)
except Exception as e:
8 changes: 7 additions & 1 deletion utils/tools.py
@@ -158,12 +159,17 @@ def get_total_urls_from_info_list(infoList, ipv6=False):
from urllib.parse import urlparse
import socket
from utils.config import config
import utils.constants as constants
import re
from bs4 import BeautifulSoup
from flask import render_template_string, send_file
Expand Down Expand Up @@ -158,12 +159,17 @@ def get_total_urls_from_info_list(infoList, ipv6=False):

pure_url, _, info = url.partition("$")
if not info:
url = add_url_info(pure_url, origin)
origin_name = constants.origin_map[origin]
if origin_name:
url = add_url_info(pure_url, origin_name)

url_is_ipv6 = is_ipv6(url)
if url_is_ipv6:
url = add_url_info(url, "IPv6")

if resolution:
url = add_url_info(url, resolution)

if url_is_ipv6:
categorized_urls[origin]["ipv6"].append(url)
else:
2 changes: 1 addition & 1 deletion version.json
@@ -1,4 +1,4 @@
{
"version": "1.5.0",
"version": "1.5.1",
"name": "IPTV电视直播源更新工具"
}
