UBot Underground

Help with clicking (looping)



Guys, I need some help with this. I am trying to build something and I am stuck here:

 

http://www.gramfeed.com/instagram/tags#models

 

I am trying to click on these smaller circled images, because they take you to the profile of the user. I tried to scrape them and put them in a list, but no luck. I attached a picture so you can see exactly which elements I need to click, or save the links in a list, whatever you think is best. I am trying to make Ubot loop through these links using either positions or something similar. I have been trying to do this for about two days now with no luck, so if someone could help I would appreciate it.

 

 

Example 2

http://www.bluemountainpeak.com/catalogsearch/result/?q=00

 

So here I am trying to click on every product individually, scrape some info, go back to the site, and choose the next product on the page.

 

I would be grateful if you could take a look at these two; I am stuck on this!

 

Thanks Guys!

[Attached image: post-18544-0-94032600-1416215635_thumb.jpg]


Hey itexspert, 

Let me know if this helps you out. For some reason, gramfeed wouldn't do anything for me via the Ubot browser; it wouldn't click anything at all. Not sure if it was just me, but since I couldn't get clicking to work, here is what I came up with for both sites (same method).


http://www.gramfeed.com/instagram/tags#models:

clear cookies
set user agent("Chrome")
navigate("http://www.gramfeed.com/instagram/tags#models", "Wait")
set(#Profile_URL, $scrape attribute(<class="photo_user">, "title"), "Global")
add list to list(%Profile_URL, $list from text(#Profile_URL, $new line), "Delete", "Global")
loop($list total(%Profile_URL)) {
    navigate("http://www.gramfeed.com/{$next list item(%Profile_URL)}", "Wait")
    wait(10)
}

http://www.bluemountainpeak.com/catalogsearch/result/?q=00:

 

clear cookies
set user agent("Chrome")
navigate("http://www.bluemountainpeak.com/catalogsearch/result/?q=00", "Wait")
set(#Product_Url, $scrape attribute(<div,class="details-wrap">, "innerhtml"), "Global")
set(#Product_Url, $replace($find regular expression(#Product_Url, "(?<=\\<a href\\=\\\").*?(?<=\\\" title)"), "\" title", ""), "Global")
add list to list(%Product_Url, $list from text(#Product_Url, $new line), "Delete", "Global")
loop($list total(%Product_Url)) {
    navigate($next list item(%Product_Url), "Wait")
    wait(10)
}

Let me know if it helps you or not.


Regards,
HaHaItsJake


One more thing: is there a way to limit how many URLs will be visited from the list?

 

 

clear cookies
set user agent("Chrome")
navigate("http://www.gramfeed.com/instagram/tags#models", "Wait")
set(#Profile_URL, $scrape attribute(<class="photo_user">, "title"), "Global")
add list to list(%Profile_URL, $list from text(#Profile_URL, $new line), "Delete", "Global")
loop($list total(%Profile_URL)) {    <<<<<<<<< Is there a way to add a prompt command here so that the user can limit how many users he wants to visit? Or add a UI text box with a variable in which the user can just write how many users he wants to visit from the list? Is that possible?

 

navigate("http://www.gramfeed.com/{$next list item(%Profile_URL)}", "Wait")
wait(10)
}


What settings do you mean? Chrome is just the user agent that the website picks up. You can change that to whatever. Setting it to mobile is smart for some websites.

For the following code, there are two ways of doing it: in the UI, or with a dialog box. (I can't remember for the life of me what the command was. I can't find the script I have it in either. I believe it's a plugin...)

But the UI approach would be simple:

 

ui text box("How mNay URLS", #Urls)
loop(#Urls) {
}
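
To tie that into the gramfeed script above, a minimal sketch (assuming %Profile_URL has already been filled the same way as before, and that the number entered isn't larger than the list total):

ui text box("How Many URLs", #Urls)
loop(#Urls) {
    navigate("http://www.gramfeed.com/{$next list item(%Profile_URL)}", "Wait")
    wait(10)
}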

Regards,
HaHaItsJake


EDIT: Found the command. Duhhh. Yeah it will be: 

 

loop($prompt("How many urls")) {
}

Or add the $prompt to a $set.
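
In other words, roughly (untested sketch, same idea as above):

set(#Urls, $prompt("How many urls"), "Global")
loop(#Urls) {
}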

 


Hey guys, try to open this site in your Ubot:

 

http://www.ripoffreport.com/r/czar-delivery-usa-inc-/coral-springs-florida-33076/car-delivery-usa-inc-all-day-auto-transport-fraudulent-bait-and-switch-said-800-at-deli-1184929

 

I am trying to scrape the address, phone number, Web, and web address fields, but my Ubot keeps shutting down.

 

Can you guys try it? Is this happening to you?


It's loading perfectly in Ubot 4 for me. I can play around with it too.

 

Try the following browser settings after saving all your work and re-opening Ubot Complete (Maybe a Reboot too?) 

 

The numbers and everything still come up, but the page loads 80% faster for me.

set referrer("google.com")
set user agent("Chrome")
allow flash("No")
allow javascript("No")
allow popups("No")
allow css("No")
allow images("No")
navigate("http://www.ripoffreport.com/r/czar-delivery-usa-inc-/coral-springs-florida-33076/car-delivery-usa-inc-all-day-auto-transport-fraudulent-bait-and-switch-said-800-at-deli-1184929", "Wait")

Regards,

HaHaItsJake


OK, it worked temporarily. This is the new issue:

 

clear cookies
set user agent("Android")
allow flash("No")
allow javascript("No")
allow popups("No")
allow css("No")
allow images("No")
increment(#row)
clear list(%ripoff)
allow images("No")
navigate("http://www.ripoffreport.com/c/483/automotive/auto-mechanics", "Wait")
wait(10)
set(#ripoofreport, $scrape attribute(<href=w"http://www.ripoffreport.com/r/*">, "href"), "Global")
add list to list(%ripoff, $list from text(#ripoofreport, $new line), "Delete", "Global")
loop($list total(%ripoff)) {
    navigate($next list item(%ripoff), "Wait")
    clear cookies
    set user agent("Android")
    wait(10)
    set(#adresaripoff, $scrape attribute(<class="address">, "innertext"), "Global")
    add item to list(%Scraped Results, $list from text(#adresaripoff, $new line), "Delete", "Global")
    set(#inforipoff, $scrape attribute($element offset(<tagname="ul">, 13), "innertext"), "Global")
    add list to table as column(&ripoffreport, #row, 0, %Scraped Results)
    set table cell(&ripoffreport, #row, 1, #inforipoff)
}

 

 

The idea behind this is: we go to some category on the Ripoff Report site, then scrape all the links in the category we are in, put them in the list, and visit them one by one to scrape the info I need.

The problem is that Ubot V4 and V5 get stuck on loading the page. Remember that list we created with the links that lead directly to the pages I need to scrape info from? Well, after Ubot goes to one of those links it gets stuck loading; the browser can't seem to load the content of the page.

Why is this happening? I am trying but have no luck with both versions!


:) You got some glitches in there. To learn, compare your code with mine. I also added comments along the way. 

I didn't run it, so the scrape might not be right. Using an element offset across more than one link will get messed up.

 

 

 

clear cookies
set user agent("Android")
allow flash("No")
allow javascript("No")
allow popups("No")
allow css("No")
allow images("No")
comment("Need to set the row to 0. Can't Increment because it doesn't have a value. 
That in return will screw up the Table Cells at the end of the loop.")
set(#row, 0, "Global")
clear list(%ripoff)
comment("Need to clear the table, or it will be added on.")
clear table(&ripoffreport)
comment("Had an extra allow images. I deleted it. Not need for it.")
navigate("http://www.ripoffrep.../auto-mechanics", "Wait")
wait(10)
set(#ripoofreport, $scrape attribute(<href=w"http://www.ripoffreport.com/r/*">, "href"), "Global")
add list to list(%ripoff, $list from text(#ripoofreport, $new line), "Delete", "Global")
loop($list total(%ripoff)) {
    navigate($next list item(%ripoff), "Wait")
    comment("The reason why it's not loading is the fact that you go to the page, then it clears the cookies and sets the user agent. It's already set and wouldn't clear the cookies unless you're using proxies.")
    wait(10)
    set(#adresaripoff, $scrape attribute(<class="address">, "innertext"), "Global")
    add item to list(%Scraped Results, $list from text(#adresaripoff, $new line), "Delete", "Global")
    comment("You need to fix the inforripoff scrape. It's nopt scraping after the first URL.")
    set(#inforipoff, $scrape attribute($element offset(<tagname="ul">, 13), "innertext"), "Global")
    add list to table as column(&ripoffreport, #row, 0, %Scraped Results)
    set table cell(&ripoffreport, #row, 1, #inforipoff)
comment("For the Row on the Table cells, it will replace the whole table each loop. I would either
move the Add List To Table outside the loop. Or find a way to get the list total and increamnt it. Or Table Row Total for the Row.")
}
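
For that last comment, here is a rough, untested sketch of one way to handle the rows: write the two scraped values straight into the table with set table cell instead of add list to table as column, and bump #row at the end of each pass (this assumes #row starts at 0 as set above, and that you only want one address value and one info value per page):

    set table cell(&ripoffreport, #row, 0, #adresaripoff)
    set table cell(&ripoffreport, #row, 1, #inforipoff)
    increment(#row)

That way each page gets its own row instead of the whole table being rewritten on every loop.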
Regards,

HaHaItsJake

