iBug
I just stumbled upon this article. Maybe it's what @makyen was referring to? [Stealing arbitrary GitHub Actions secrets - Teddy Katz's Blog](https://blog.teddykatz.com/2021/03/17/github-actions-write-access.html) However, as indicated in the blog post itself,...
@ArtOfCode- Referencing [angussidney](https://chat.stackexchange.com/transcript/message/51108310#51108310), we're doing 10k API calls each day, and each API call writes the file once, so 2 MiB × 10k ≈ 20 GiB is a fairly accurate estimate.
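(Working that out explicitly: 2 MiB × 10,000 writes = 20,000 MiB ≈ 19.5 GiB per day, so the 20 GiB figure is a fair round-up.)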
We should probably still link to the spec pages somewhere so that they remain accessible. Linking it from `README.md` using relative links won't work very well because the spec pages...
I'm inclined to preserve the pages *on the website* because of the use of theme-specific markup. Also note that some content in the specs relies on such markup (e.g. [this][1])....
I wonder if a "Spec landing page" is really necessary. Considering it's unlikely that we'll add a 3rd page later on, it may be simpler to just link to...
@Tiny-Giant What if the addition is only regex/replacement pairs, with no functions involved? I think regexes can be stored as text and processed later, so this shouldn't be a problem...
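A minimal sketch of that idea, assuming the pairs are serialized as plain JSON (pattern source, flags, replacement) so that no functions ever need to be stored; the rule contents and the helper name `applyRules` here are purely illustrative:

```javascript
// Regex/replacement pairs kept as plain text (JSON), rebuilt into RegExp
// objects only at the moment they are applied.
var rulesJson = JSON.stringify([
  { pattern: "\\bteh\\b",        flags: "gi", replacement: "the" },
  { pattern: "\\bjavascript\\b", flags: "g",  replacement: "JavaScript" }
]);

function applyRules(text, json) {
  var rules = JSON.parse(json);
  for (var i = 0; i < rules.length; i++) {
    var r = rules[i];
    // The regex is stored as text and only turned into a RegExp here
    text = text.replace(new RegExp(r.pattern, r.flags), r.replacement);
  }
  return text;
}

applyRules("teh javascript", rulesJson); // "the JavaScript"
```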
You can put the domains you want to blacklist/whitelist into a [Set](https://developer.mozilla.org/zh-CN/docs/Web/JavaScript/Reference/Global_Objects/Set), then insert a check at the very beginning of `function FindProxyForURL` that returns `proxy` or `direct` right away.
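A minimal sketch of what that early-return check could look like, assuming the PAC evaluator supports ES2015 `Set`; the proxy string and the `BLACKLIST`/`WHITELIST` names and entries are placeholders, not taken from the actual script:

```javascript
var BLACKLIST = new Set(["blocked.example.com", "ads.example.org"]);
var WHITELIST = new Set(["fast.example.net"]);

function FindProxyForURL(url, host) {
  // Early return before any other matching logic
  if (BLACKLIST.has(host)) return "PROXY 127.0.0.1:1080"; // "proxy"
  if (WHITELIST.has(host)) return "DIRECT";               // "direct"

  // ... existing pattern-matching logic continues here ...
  return "DIRECT";
}
```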
The `BLACKPAT` in the current script is an array, and each item in it is a wildcard expression (see [`shExpMatch()`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Proxy_servers_and_tunneling/Proxy_Auto-Configuration_(PAC)_file#shExpMatch)). Please make sure your JavaScript syntax is correct, otherwise the entire PAC file will fail to parse.
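For illustration, a hedged sketch of how such an array of wildcard patterns is typically checked with `shExpMatch()`; the entries and the helper name `matchesBlackPat` are made up, not taken from the generated script:

```javascript
var BLACKPAT = [
  "*://*.example.com/*",
  "*://tracker.example.org/*"
];

// shExpMatch() is provided by the PAC runtime; it matches shell-style wildcards.
function matchesBlackPat(url) {
  for (var i = 0; i < BLACKPAT.length; i++) {
    if (shExpMatch(url, BLACKPAT[i])) return true;
  }
  return false;
}
```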
`BLACKPAT` and `WHITEPAT` match the entire URL. Matching by domain name alone is a bit more involved: I regrouped the domains by zone into the `DOMAINS` variable mentioned earlier, in the following format:

```javascript
var DOMAINS = {
  "com": {
    "example": 0,
    "google": {
      "@": 0,
      "mtalk": 1
    }
  }
};
```

The lookup goes from right to left, e.g. ...
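A sketch of that right-to-left lookup; the helper name `lookupDomain` and the interpretation of the leaf values `0`/`1` are assumptions here, so check the actual script for how the result maps to `proxy`/`direct`:

```javascript
function lookupDomain(host) {
  var labels = host.split(".");                    // "mtalk.google.com" -> ["mtalk", "google", "com"]
  var node = DOMAINS;
  for (var i = labels.length - 1; i >= 0; i--) {   // start at the TLD and walk left
    var next = node[labels[i]];
    if (next === undefined) return null;           // no rule for this domain
    if (typeof next === "number") return next;     // leaf value covers this whole zone
    node = next;                                    // descend into the subdomain table
  }
  // All labels consumed inside a subtree: the "@" entry holds the exact-domain rule
  return ("@" in node) ? node["@"] : null;
}

// lookupDomain("mtalk.google.com") -> 1, lookupDomain("google.com") -> 0
```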
Heads-up 3: [Releases](https://github.com/iBug/pac/releases/latest) automatically gets a new release every Saturday at 8 PM (Beijing time), containing a pac script freshly generated from the data sources.

Heads-up 2: The [`dist` branch](https://github.com/iBug/pac/tree/dist) automatically receives a new pac script every Saturday at 8 PM (Beijing time).

**Deprecated** For the sake of keeping this open-source project self-contained, I no longer think hosting a separate generator on my personal website is a good idea; it scatters the code and resources too much. Another issue is that both existing data sources are updated very infrequently, so a web-based real-time generator is somewhat redundant, and weekly automatic updates via GitHub Actions are already enough.

Heads-up: I have made a new [web-based generator](https://ibugone.com/project/pac-generator) to replace this repository. You can open it directly in a browser, fetch the latest IP address list, and generate and download a PAC file locally, quick and convenient.

Save the updated list of IP ranges as `ip_ranges.txt`, then run the Python script to get the output. If you need to update...