UBot Underground

Scrape All Pages In Google With HTTP Post



clear list(%Link Blogspot Profile)

ui block text("Keyword", #Keyword)

ui stat monitor("Total Link Profile", $list total(%Link Blogspot Profile))

add list to list(%Keyword, $list from text(#Keyword, $new line), "Delete", "Global")

ui save file("Save File Location (*.txt)", #Save File Location)

clear list(%Increment Page)

ui drop down("Footprint Blog", "site:blogger.com/profile,site:blogger.com/profile admin,site:blogger.com/profile keyword", #Footprint Blog)

add list to list(%Increment Page, $list from text("0
10
20
30
40
50
60
70
80
90
100
110
120
130
140
150
160
170
180
190
200
210
220
230
240
250
260
270
280
290
300
310
320
330
340
350
360
370
380
390
400
410
420
430
440
450
460
470
480
490
500
510
520
530
540
550
", $new line), "Delete", "Global")

loop while($comparison($list total(%Keyword), ">", 0)) {

    set(#google_results, $plugin function("HTTP post.dll", "$http get", "https://www.google.com/search?q={#Footprint Blog} \"{#Keyword}\"&start={$list item(%Increment Page, 0)}", $plugin function("HTTP post.dll", "$http useragent string", "Firefox 27.0 Win7 64-bit"), "", ""), "Global")

    add list to list(%Link Blogspot Profile, $plugin function("HTTP post.dll", "$xpath parser", #google_results, "//div//h3//a[contains(@onmousedown,\'rwt\')]", "href", "HTML"), "Delete", "Global")

    remove from list(%Increment Page, 0)

    set(#google_results, $nothing, "Global")

    if($comparison($list total(%Increment Page), "=", 0)) {

        then {

            add list to list(%Increment Page, $list from text("0
10
20
30
40
50
60
70
80
90
100
110
120
130
140
150
160
170
180
190
200
210
220
230
240
250
260
270
280
290
300
310
320
330
340
350
360
370
380
390
400
410
420
430
440
450
460
470
480
490
500
510
520
530
540
550
", $new line), "Delete", "Global")

        }

        else {

        }

    }

    save to file(#Save File Location, %Link Blogspot Profile)

}
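For anyone trying to follow what the loop above requests: it walks the `%Increment Page` list as Google's `start=` offsets (0, 10, ..., 550), fetching one results page per iteration. A minimal Python sketch of just the URL-building side (the function name and keyword values are my own illustration, not part of the uBot script):

```python
from urllib.parse import quote_plus

def build_search_urls(footprint, keyword, step=10, last=550):
    """Build the same paginated Google search URLs the uBot loop
    requests: one URL per start offset 0, 10, ..., last."""
    q = quote_plus(f'{footprint} "{keyword}"')
    return [f"https://www.google.com/search?q={q}&start={s}"
            for s in range(0, last + 1, step)]

urls = build_search_urls("site:blogger.com/profile", "seo")
print(len(urls))   # 56 pages (start=0 through start=550)
print(urls[4])     # the start=40 page, where the script reportedly stalls
```

Note that requesting 56 pages back-to-back from one IP is exactly the pattern Google rate-limits, which matters for the question below.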

I want to scrape the links from every results page in Google, but the script stops working once %Increment Page goes above start=40. Any solution?


I don't have uBot installed at the moment, but just glancing at your code...

 

You are probably hitting a captcha and I don't see any proxy usage.

 

Private proxies are best
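To make that advice concrete outside uBot: the usual fixes are rotating requests across private proxies and pausing between pages so the traffic doesn't look like a scraper. A minimal Python sketch of both ideas (the proxy addresses and delay values are placeholders, not a recommendation of specific settings):

```python
import itertools
import random

# Placeholder private proxies -- substitute your own host:port entries.
PROXIES = ["203.0.113.10:8080", "203.0.113.11:8080", "203.0.113.12:8080"]

def proxy_rotation(proxies):
    """Yield proxies in round-robin order, endlessly, so each
    successive page request goes out through a different IP."""
    return itertools.cycle(proxies)

def polite_delay(base=5.0, jitter=3.0):
    """Seconds to wait before the next request: a fixed base plus
    random jitter, so requests are not evenly machine-spaced."""
    return base + random.uniform(0, jitter)

rotation = proxy_rotation(PROXIES)
print(next(rotation))  # first request -> 203.0.113.10:8080
print(next(rotation))  # second request -> 203.0.113.11:8080
```

In uBot the same idea would mean loading a proxy list, switching proxy each loop iteration, and adding a wait before each HTTP GET.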

 

CD

 

OK, thanks for your suggestion, but that's not the problem at all. Any solution? Please help.

