Table of Contents
Preface
I. Web Data Acquisition Workflow
1. Open the platform's Trade Center
2. Click any tender announcement
II. Web Data Analysis
1. Obtaining tender announcement information
1.1 Analyzing where the tender announcements come from
1.2 Capturing the XHR in DevTools
1.3 Locating the decryption JS code
1.4 A one-shot way to download the tender announcement info
1.4.1 Extracting the code
1.4.2 The complete one-shot download flow
1.5 Fetching tender announcement info in bulk
1.5.1 Observing what changes
1.5.2 Finding the JS that generates the ts and portal-sign parameters
1.5.3 Extracting the code
1.5.4 Bulk-fetching tradeinfo with an async crawler
Summary
Preface
Some time ago I took Mr. He's free course and became very interested in how to efficiently scrape data from the Fujian Provincial Public Resource Trading Electronic Public Service Platform (福建省公共资源交易电子公共服务平台). The course leaves a lot uncovered, though: what it teaches about this platform is nowhere near enough to scrape its data in bulk. So how can that be done?
This article only covers bulk-fetching the tender announcement information; how to download the announcement contents is covered in the follow-up article:
批量爬取福建省公共资源交易电子公共服务平台数据2-CSDN博客
Together, the two articles cover the full bulk scrape of the platform's data.
I. Web Data Acquisition Workflow
1. Open the platform's Trade Center
2. Click any tender announcement
These steps are easy for a human; how do we get a machine to do them?
II. Web Data Analysis
1. Obtaining tender announcement information
1.1 Analyzing where the tender announcements come from
Viewing the page source shows that the tender announcements are not in the HTML, i.e. they are not fetched by a static request. So they are either generated directly by JS or fetched asynchronously via XHR.
1.2 Capturing the XHR in DevTools
Open DevTools, go to Network -> Fetch/XHR, and click any page number under the announcement list; a lone tradeinfo request appears.
Its Data field is almost certainly the tender announcement information we want.
1.3 Locating the decryption JS code
First check whether a standard encryption/decryption scheme is used:
The initiator (call stack) is not obfuscated, so we can try to locate the decryption code directly.
Following Mr. He's method, click to
open the search panel:
and type 'decrypt(' in the search box to locate it.
There are plenty of hits, but the useful one is at the very top; click it to jump to the JS code:
It is the useful one because it is a plain function call with arguments, while most of the other hits are method definitions, classes and so on, which don't match the encryption/decryption flow we expect:
Encryption: plaintext -> encryption function (cipher) -> ciphertext
Decryption: ciphertext -> decryption function (cipher) -> plaintext
Set a breakpoint on return a.toString(h.a.enc.Utf8), click a new page number, and the browser pauses in the debugger.
In the console, evaluate a.toString(h.a.enc.Utf8):
This is exactly the tender announcement content we want.
1.4 A one-shot way to download the tender announcement info
1.4.1 Extracting the code:
function b(t) {
    var e = h.a.enc.Utf8.parse(r["e"]),
        n = h.a.enc.Utf8.parse(r["i"]),
        a = h.a.AES.decrypt(t, e, {
            iv: n,
            mode: h.a.mode.CBC,
            padding: h.a.pad.Pkcs7
        });
    return a.toString(h.a.enc.Utf8)
}
decrypt takes several parameters, among them mode = CBC and padding = Pkcs7, so this is almost certainly standard AES decryption.
Let's replace the minified instance names with the standard library's name:
const cryptojs = require('crypto-js')
function b(t) {
    var e = cryptojs.enc.Utf8.parse(r["e"]),
        n = cryptojs.enc.Utf8.parse(r["i"]),
        a = cryptojs.AES.decrypt(t, e, {
            iv: n,
            mode: cryptojs.mode.CBC,
            padding: cryptojs.pad.Pkcs7
        });
    return a.toString(cryptojs.enc.Utf8)
}
r["e"] and r["i"] are still unknown, so print them in the console:
Then substitute the values in:
const cryptojs = require('crypto-js')
function b(t) {
    var e = cryptojs.enc.Utf8.parse('EB444973714E4A40876CE66BE45D5930'),
        n = cryptojs.enc.Utf8.parse('B5A8904209931867'),
        a = cryptojs.AES.decrypt(t, e, {
            iv: n,
            mode: cryptojs.mode.CBC,
            padding: cryptojs.pad.Pkcs7
        });
    return a.toString(cryptojs.enc.Utf8)
}
Run it in WebStorm:
The information comes out!
1.4.2 The complete one-shot download flow
Copy the tradeinfo request as cURL (bash) and paste it into Convert curl commands to code (curlconverter.com):
Copy the generated code into PyCharm, then, following Mr. He's steps, save the decrypt function to a JS file and call it from PyCharm; this fetches the tender announcement information exactly once.
1.5 Fetching tender announcement info in bulk
If you merely change pageNo and request again, it fails: the returned tradeinfo has no Data, and its msg reports an unknown error.
That left me puzzled. What now? Surely I don't have to repeat all the steps above for every request, a hundred times for a hundred pages?
No way. That would defeat the whole point of using a crawler in the first place.
So what do we do?
The steps below are my own; they may be a bit messy, but they definitely work.
1.5.1 Observing what changes
After some clicking around, I found that the tradeinfo request has two parts that must be generated correctly:
the portal-sign request header, and the ts field in the payload.
So how do we produce a valid ts and a valid portal-sign?
1.5.2 Finding the JS that generates the ts and portal-sign parameters
As before, I searched for the keyword ts. Far too generic; everything matches.
So I searched for portal-sign instead; something that distinctive should pin it down.
Click through to the hit.
And... it's unreadable!
So pretty-print it:
I tested it: the code before m.interceptors.request.use is irrelevant. I renamed function(t) to get_portal_sign and tidied up the hopelessly mangled return below it. Much clearer now:
function get_portal_sign(t) {
    t.headers.baseURL && (t.baseURL = t.headers.baseURL);
    var e = Object.assign({}, t.params);
    return e["ts"] = (new Date).getTime(),
        "string" === typeof t.data && (t.data = JSON.parse(t.data)),
        "post" === t.method && t.data && Object.assign(e, t.data),
        t.headers["portal-sign"] = d(e),
        "post" === t.method ? t.data = g(g({}, t.data), {}, {ts: e["ts"]}) : "get" === t.method && (t.params = g(g({}, t.params), {}, {ts: e["ts"]})),
        t
}
A nice surprise: there is e["ts"], and printing it in the console shows it is exactly the ts from our payload!
So ts is no mystery after all; the real question is what t is.
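The ts half is indeed trivial: (new Date).getTime() is just the current Unix time in milliseconds, which Python can reproduce directly. A minimal sketch:

```python
# ts is simply the current epoch time in milliseconds,
# the same value JS produces with (new Date).getTime()
import time

def make_ts() -> int:
    return int(time.time() * 1000)
```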
Print t in the console:
Still unreadable. No problem, copy it into PyCharm and look:
After clicking many different page numbers, I found that only one field of t actually changes: pageNo. (total also varies between visits, but as long as you don't close the platform's page it stays fixed.)
Now things get simple.
1.5.3 Extracting the code
I'll skip the details of the extraction; it's basically a lot of jumping between definitions. Along the way there is one standard MD5 hash, for which you can simply use any MD5 implementation configured for 32-character output. Here is all the extracted code:
var t = {
"url": "/Trade/TradeInfo",
"method": "post",
"data": {
"pageNo": null,
"pageSize": 20,
"total": 3688,
"AREACODE": "",
"M_PROJECT_TYPE": "",
"KIND": "GCJS",
"GGTYPE": "1",
"PROTYPE": "",
"timeType": "6",
"BeginTime": "2023-09-10 00:00:00",
"EndTime": "2024-03-10 23:59:59",
"createTime": []
},
"headers": {
"common": {
"Accept": "application/json, text/plain, */*"
}, "delete": {}, "get": {}, "head": {}, "post": {
"Content-Type": "application/x-www-form-urlencoded"
}, "put": {
"Content-Type": "application/x-www-form-urlencoded"
}, "patch": {
"Content-Type": "application/x-www-form-urlencoded"
}, "content-type": "application/json;charset=UTF-8"
},
"baseURL": "/FwPortalApi",
"transformRequest": [null],
"transformResponse": [null],
"timeout": 0,
"xsrfCookieName": "XSRF-TOKEN",
"xsrfHeaderName": "X-XSRF-TOKEN",
"maxContentLength": -1
}
function s(t, e) {
return t.toString().toUpperCase() > e.toString().toUpperCase() ? 1 : t.toString().toUpperCase() === e.toString().toUpperCase() ? 0 : -1
}
function l(t) {
for (var e = Object.keys(t).sort(s), n = "", a = 0; a < e.length; a++) if (void 0 !== t[e[a]]) if (t[e[a]] && t[e[a]] instanceof Object || t[e[a]] instanceof Array) {
var i = JSON.stringify(t[e[a]]);
n += e[a] + i
} else n += e[a] + t[e[a]];
return n
}
function d(t) {
for (var e in t) "" !== t[e] && void 0 !== t[e] || delete t[e];
var n = 'B3978D054A72A7002063637CCDF6B2E5' + l(t);
console.log(n)
return md5(n, 32).toLowerCase()
}
function md5(string, bit) {
function md5_RotateLeft(lValue, iShiftBits) {
return (lValue << iShiftBits) | (lValue >>> (32 - iShiftBits));
}
function md5_AddUnsigned(lX, lY) {
var lX4, lY4, lX8, lY8, lResult;
lX8 = (lX & 0x80000000);
lY8 = (lY & 0x80000000);
lX4 = (lX & 0x40000000);
lY4 = (lY & 0x40000000);
lResult = (lX & 0x3FFFFFFF) + (lY & 0x3FFFFFFF);
if (lX4 & lY4) {
return (lResult ^ 0x80000000 ^ lX8 ^ lY8);
}
if (lX4 | lY4) {
if (lResult & 0x40000000) {
return (lResult ^ 0xC0000000 ^ lX8 ^ lY8);
} else {
return (lResult ^ 0x40000000 ^ lX8 ^ lY8);
}
} else {
return (lResult ^ lX8 ^ lY8);
}
}
function md5_F(x, y, z) {
return (x & y) | ((~x) & z);
}
function md5_G(x, y, z) {
return (x & z) | (y & (~z));
}
function md5_H(x, y, z) {
return (x ^ y ^ z);
}
function md5_I(x, y, z) {
return (y ^ (x | (~z)));
}
function md5_FF(a, b, c, d, x, s, ac) {
a = md5_AddUnsigned(a, md5_AddUnsigned(md5_AddUnsigned(md5_F(b, c, d), x), ac));
return md5_AddUnsigned(md5_RotateLeft(a, s), b);
};
function md5_GG(a, b, c, d, x, s, ac) {
a = md5_AddUnsigned(a, md5_AddUnsigned(md5_AddUnsigned(md5_G(b, c, d), x), ac));
return md5_AddUnsigned(md5_RotateLeft(a, s), b);
};
function md5_HH(a, b, c, d, x, s, ac) {
a = md5_AddUnsigned(a, md5_AddUnsigned(md5_AddUnsigned(md5_H(b, c, d), x), ac));
return md5_AddUnsigned(md5_RotateLeft(a, s), b);
};
function md5_II(a, b, c, d, x, s, ac) {
a = md5_AddUnsigned(a, md5_AddUnsigned(md5_AddUnsigned(md5_I(b, c, d), x), ac));
return md5_AddUnsigned(md5_RotateLeft(a, s), b);
};
function md5_ConvertToWordArray(string) {
var lWordCount;
var lMessageLength = string.length;
var lNumberOfWords_temp1 = lMessageLength + 8;
var lNumberOfWords_temp2 = (lNumberOfWords_temp1 - (lNumberOfWords_temp1 % 64)) / 64;
var lNumberOfWords = (lNumberOfWords_temp2 + 1) * 16;
var lWordArray = Array(lNumberOfWords - 1);
var lBytePosition = 0;
var lByteCount = 0;
while (lByteCount < lMessageLength) {
lWordCount = (lByteCount - (lByteCount % 4)) / 4;
lBytePosition = (lByteCount % 4) * 8;
lWordArray[lWordCount] = (lWordArray[lWordCount] | (string.charCodeAt(lByteCount) << lBytePosition));
lByteCount++;
}
lWordCount = (lByteCount - (lByteCount % 4)) / 4;
lBytePosition = (lByteCount % 4) * 8;
lWordArray[lWordCount] = lWordArray[lWordCount] | (0x80 << lBytePosition);
lWordArray[lNumberOfWords - 2] = lMessageLength << 3;
lWordArray[lNumberOfWords - 1] = lMessageLength >>> 29;
return lWordArray;
};
function md5_WordToHex(lValue) {
var WordToHexValue = "", WordToHexValue_temp = "", lByte, lCount;
for (lCount = 0; lCount <= 3; lCount++) {
lByte = (lValue >>> (lCount * 8)) & 255;
WordToHexValue_temp = "0" + lByte.toString(16);
WordToHexValue = WordToHexValue + WordToHexValue_temp.substr(WordToHexValue_temp.length - 2, 2);
}
return WordToHexValue;
};
function md5_Utf8Encode(string) {
string = string.replace(/\r\n/g, "\n");
var utftext = "";
for (var n = 0; n < string.length; n++) {
var c = string.charCodeAt(n);
if (c < 128) {
utftext += String.fromCharCode(c);
} else if ((c > 127) && (c < 2048)) {
utftext += String.fromCharCode((c >> 6) | 192);
utftext += String.fromCharCode((c & 63) | 128);
} else {
utftext += String.fromCharCode((c >> 12) | 224);
utftext += String.fromCharCode(((c >> 6) & 63) | 128);
utftext += String.fromCharCode((c & 63) | 128);
}
}
return utftext;
};var x = Array();
var k, AA, BB, CC, DD, a, b, c, d;
var S11 = 7, S12 = 12, S13 = 17, S14 = 22;
var S21 = 5, S22 = 9, S23 = 14, S24 = 20;
var S31 = 4, S32 = 11, S33 = 16, S34 = 23;
var S41 = 6, S42 = 10, S43 = 15, S44 = 21;
string = md5_Utf8Encode(string);
x = md5_ConvertToWordArray(string);
a = 0x67452301;
b = 0xEFCDAB89;
c = 0x98BADCFE;
d = 0x10325476;
for (k = 0; k < x.length; k += 16) {
AA = a;
BB = b;
CC = c;
DD = d;
a = md5_FF(a, b, c, d, x[k + 0], S11, 0xD76AA478);
d = md5_FF(d, a, b, c, x[k + 1], S12, 0xE8C7B756);
c = md5_FF(c, d, a, b, x[k + 2], S13, 0x242070DB);
b = md5_FF(b, c, d, a, x[k + 3], S14, 0xC1BDCEEE);
a = md5_FF(a, b, c, d, x[k + 4], S11, 0xF57C0FAF);
d = md5_FF(d, a, b, c, x[k + 5], S12, 0x4787C62A);
c = md5_FF(c, d, a, b, x[k + 6], S13, 0xA8304613);
b = md5_FF(b, c, d, a, x[k + 7], S14, 0xFD469501);
a = md5_FF(a, b, c, d, x[k + 8], S11, 0x698098D8);
d = md5_FF(d, a, b, c, x[k + 9], S12, 0x8B44F7AF);
c = md5_FF(c, d, a, b, x[k + 10], S13, 0xFFFF5BB1);
b = md5_FF(b, c, d, a, x[k + 11], S14, 0x895CD7BE);
a = md5_FF(a, b, c, d, x[k + 12], S11, 0x6B901122);
d = md5_FF(d, a, b, c, x[k + 13], S12, 0xFD987193);
c = md5_FF(c, d, a, b, x[k + 14], S13, 0xA679438E);
b = md5_FF(b, c, d, a, x[k + 15], S14, 0x49B40821);
a = md5_GG(a, b, c, d, x[k + 1], S21, 0xF61E2562);
d = md5_GG(d, a, b, c, x[k + 6], S22, 0xC040B340);
c = md5_GG(c, d, a, b, x[k + 11], S23, 0x265E5A51);
b = md5_GG(b, c, d, a, x[k + 0], S24, 0xE9B6C7AA);
a = md5_GG(a, b, c, d, x[k + 5], S21, 0xD62F105D);
d = md5_GG(d, a, b, c, x[k + 10], S22, 0x2441453);
c = md5_GG(c, d, a, b, x[k + 15], S23, 0xD8A1E681);
b = md5_GG(b, c, d, a, x[k + 4], S24, 0xE7D3FBC8);
a = md5_GG(a, b, c, d, x[k + 9], S21, 0x21E1CDE6);
d = md5_GG(d, a, b, c, x[k + 14], S22, 0xC33707D6);
c = md5_GG(c, d, a, b, x[k + 3], S23, 0xF4D50D87);
b = md5_GG(b, c, d, a, x[k + 8], S24, 0x455A14ED);
a = md5_GG(a, b, c, d, x[k + 13], S21, 0xA9E3E905);
d = md5_GG(d, a, b, c, x[k + 2], S22, 0xFCEFA3F8);
c = md5_GG(c, d, a, b, x[k + 7], S23, 0x676F02D9);
b = md5_GG(b, c, d, a, x[k + 12], S24, 0x8D2A4C8A);
a = md5_HH(a, b, c, d, x[k + 5], S31, 0xFFFA3942);
d = md5_HH(d, a, b, c, x[k + 8], S32, 0x8771F681);
c = md5_HH(c, d, a, b, x[k + 11], S33, 0x6D9D6122);
b = md5_HH(b, c, d, a, x[k + 14], S34, 0xFDE5380C);
a = md5_HH(a, b, c, d, x[k + 1], S31, 0xA4BEEA44);
d = md5_HH(d, a, b, c, x[k + 4], S32, 0x4BDECFA9);
c = md5_HH(c, d, a, b, x[k + 7], S33, 0xF6BB4B60);
b = md5_HH(b, c, d, a, x[k + 10], S34, 0xBEBFBC70);
a = md5_HH(a, b, c, d, x[k + 13], S31, 0x289B7EC6);
d = md5_HH(d, a, b, c, x[k + 0], S32, 0xEAA127FA);
c = md5_HH(c, d, a, b, x[k + 3], S33, 0xD4EF3085);
b = md5_HH(b, c, d, a, x[k + 6], S34, 0x4881D05);
a = md5_HH(a, b, c, d, x[k + 9], S31, 0xD9D4D039);
d = md5_HH(d, a, b, c, x[k + 12], S32, 0xE6DB99E5);
c = md5_HH(c, d, a, b, x[k + 15], S33, 0x1FA27CF8);
b = md5_HH(b, c, d, a, x[k + 2], S34, 0xC4AC5665);
a = md5_II(a, b, c, d, x[k + 0], S41, 0xF4292244);
d = md5_II(d, a, b, c, x[k + 7], S42, 0x432AFF97);
c = md5_II(c, d, a, b, x[k + 14], S43, 0xAB9423A7);
b = md5_II(b, c, d, a, x[k + 5], S44, 0xFC93A039);
a = md5_II(a, b, c, d, x[k + 12], S41, 0x655B59C3);
d = md5_II(d, a, b, c, x[k + 3], S42, 0x8F0CCC92);
c = md5_II(c, d, a, b, x[k + 10], S43, 0xFFEFF47D);
b = md5_II(b, c, d, a, x[k + 1], S44, 0x85845DD1);
a = md5_II(a, b, c, d, x[k + 8], S41, 0x6FA87E4F);
d = md5_II(d, a, b, c, x[k + 15], S42, 0xFE2CE6E0);
c = md5_II(c, d, a, b, x[k + 6], S43, 0xA3014314);
b = md5_II(b, c, d, a, x[k + 13], S44, 0x4E0811A1);
a = md5_II(a, b, c, d, x[k + 4], S41, 0xF7537E82);
d = md5_II(d, a, b, c, x[k + 11], S42, 0xBD3AF235);
c = md5_II(c, d, a, b, x[k + 2], S43, 0x2AD7D2BB);
b = md5_II(b, c, d, a, x[k + 9], S44, 0xEB86D391);
a = md5_AddUnsigned(a, AA);
b = md5_AddUnsigned(b, BB);
c = md5_AddUnsigned(c, CC);
d = md5_AddUnsigned(d, DD);
}
if (bit == 32) {
return (md5_WordToHex(a) + md5_WordToHex(b) + md5_WordToHex(c) + md5_WordToHex(d)).toLowerCase();
}
return (md5_WordToHex(b) + md5_WordToHex(c)).toLowerCase();
}
function v(t, e) {
var n = Object.keys(t);
if (Object.getOwnPropertySymbols) {
var a = Object.getOwnPropertySymbols(t);
e && (a = a.filter((function (e) {
return Object.getOwnPropertyDescriptor(t, e).enumerable
}))), n.push.apply(n, a)
}
return n
}
function aa(t, e, r) {
return e in t ? Object.defineProperty(t, e, {
value: r, enumerable: !0, configurable: !0, writable: !0
}) : t[e] = r, t
}
function g(t) {
for (var e = 1; e < arguments.length; e++) {
var n = null != arguments[e] ? arguments[e] : {};
e % 2 ? v(Object(n), !0).forEach((function (e) {
Object(aa)(t, e, n[e])
})) : Object.getOwnPropertyDescriptors ? Object.defineProperties(t, Object.getOwnPropertyDescriptors(n)) : v(Object(n)).forEach((function (e) {
Object.defineProperty(t, e, Object.getOwnPropertyDescriptor(n, e))
}))
}
return t
}
function get_portal_sign(pageNo) {
t['data']['pageNo'] = pageNo
t.headers.baseURL && (t.baseURL = t.headers.baseURL);
var e = Object.assign({}, t.params);
return e["ts"] = (new Date).getTime(), "string" === typeof t.data && (t.data = JSON.parse(t.data)), "post" === t.method && t.data && Object.assign(e, t.data), t.headers["portal-sign"] = d(e), "post" === t.method ? t.data = g(g({}, t.data), {}, {ts: e["ts"]}) : "get" === t.method && (t.params = g(g({}, t.params), {}, {ts: e["ts"]})), t
}
var ps = get_portal_sign(6)
console.log(ps)
Printing the result for page 6:
The data field is exactly the payload, and headers contains exactly the portal-sign I wanted!
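Incidentally, nothing in the signing logic above (s, l and d) actually requires JavaScript: it sorts the payload keys case-insensitively, concatenates key+value pairs (JSON-encoding nested objects and arrays, dropping empty values), prepends a fixed key, and MD5-hashes the result. Here is a pure-Python sketch of that logic; the payload fields mirror the t object above, and I haven't verified it byte-for-byte against the live service:

```python
# Pure-Python sketch of the portal-sign computation extracted above:
# sort keys case-insensitively, concatenate key+value (JSON for nested
# objects/arrays, empty values dropped), prepend the fixed key, MD5, lowercase.
import hashlib
import json
import time

SIGN_KEY = 'B3978D054A72A7002063637CCDF6B2E5'  # constant from function d above

def concat_params(params: dict) -> str:
    out = ''
    for k in sorted(params, key=lambda k: str(k).upper()):  # mirrors sort(s)
        v = params[k]
        if v is None or v == '':
            continue  # function d deletes empty/undefined values
        if isinstance(v, (dict, list)):
            out += k + json.dumps(v, separators=(',', ':'))  # JSON.stringify style
        else:
            out += k + str(v)
    return out

def portal_sign(payload: dict) -> str:
    return hashlib.md5((SIGN_KEY + concat_params(payload)).encode('utf-8')).hexdigest().lower()

def signed_payload(page_no: int, total: int = 3688) -> tuple:
    """Build the tradeinfo payload for one page plus its portal-sign header value."""
    data = {
        'pageNo': page_no, 'pageSize': 20, 'total': total,
        'AREACODE': '', 'M_PROJECT_TYPE': '', 'KIND': 'GCJS', 'GGTYPE': '1',
        'PROTYPE': '', 'timeType': '6',
        'BeginTime': '2023-09-10 00:00:00', 'EndTime': '2024-03-10 23:59:59',
        'createTime': [], 'ts': int(time.time() * 1000),
    }
    return data, portal_sign(data)
```

If this matches the server's expectation, it removes the execjs dependency from the crawler below; the execjs route remains the safer bet since it runs the site's own code.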
1.5.4 Bulk-fetching tradeinfo with an async crawler
Here I'll just give the code; it's all straightforward:
# Asynchronously fetch the tradeinfo JSON page by page and decrypt it
import os
import execjs
import aiohttp
import asyncio
import aiofiles


async def get_tradeinfo(headers, cookies, json_data, pageNo, session, my_decrypt):
    if not os.path.exists(f'./TradeInfo/{pageNo}_tradeinfo.json'):
        async with session.post('https://ggzyfw.fj.gov.cn/FwPortalApi/Trade/TradeInfo', headers=headers,
                                cookies=cookies, json=json_data) as response:
            rt = await response.json()
            if response.status == 200:
                async with aiofiles.open(f'./TradeInfo/{pageNo}_tradeinfo.json', 'w', encoding='utf-8') as af:
                    await af.write(execjs.compile(my_decrypt).call('b', rt['Data']))
                print(f"Page {pageNo} TradeInfo downloaded")
    else:
        print(f"Page {pageNo} TradeInfo already downloaded")


async def main(my_decrypt, cookies, max_num=100):
    tasks = []
    async with aiohttp.ClientSession() as session:
        with open('portal_sign1.js', 'r', encoding='utf-8') as f:
            getps = f.read()
        for pageNo in range(1, max_num):
            ctx = execjs.compile(getps).call('get_portal_sign', pageNo)
            # print(ctx['headers'], '\n', ctx['data'])
            portal_sign = ctx['headers']["portal-sign"]
            json_data = ctx['data']
            headers = {
                'Accept': 'application/json, text/plain, */*',
                'Accept-Language': 'zh-CN,zh;q=0.9,en;q=0.8,en-GB;q=0.7,en-US;q=0.6',
                'Connection': 'keep-alive',
                'Content-Type': 'application/json;charset=UTF-8',
                # 'Cookie': 'ASP.NET_SessionId=r1x2v3ynyljczlss50cw4tod',
                'Origin': 'https://ggzyfw.fj.gov.cn',
                'Referer': 'https://ggzyfw.fj.gov.cn/business/list',
                'Sec-Fetch-Dest': 'empty',
                'Sec-Fetch-Mode': 'cors',
                'Sec-Fetch-Site': 'same-origin',
                'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0',
                'portal-sign': portal_sign,  # varying parameter ★
                'sec-ch-ua': '"Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"',
                'sec-ch-ua-mobile': '?0',
                'sec-ch-ua-platform': '"Windows"',
            }
            tasks.append(asyncio.create_task(get_tradeinfo(headers, cookies, json_data, pageNo, session, my_decrypt)))
        await asyncio.wait(tasks)


if __name__ == '__main__':
    os.makedirs('./TradeInfo', exist_ok=True)
    with open('my_decrypt.js', 'r', encoding='utf-8') as f:
        my_decrypt = f.read()
    cookies = {
        'ASP.NET_SessionId': 'r1x2v3ynyljczlss50cw4tod',
    }
    asyncio.run(main(my_decrypt, cookies, max_num=150))
Notes
Before crawling asynchronously, open the platform once to obtain its cookie, and don't close the page while crawling, so that the cookie stays valid.
It's also best to fetch a fresh total parameter: it changes each time you enter the site, but once you're in it stays constant no matter what you do, unlike ts and portal-sign, which change with every action.
Summary
I have also finished scraping the tender announcement page contents themselves, but this post is long enough, so the rest is published at:
批量爬取福建省公共资源交易电子公共服务平台数据
The follow-up is quite similar to this one, with a lot of overlapping content.
Happy crawling!