
Multiple DS read from internet


JCFC


Hi all,

I want to extract data every 30 seconds from 200+ web pages; the pages are in KML format (Google Earth).

I tried to do this:

dskml.png

But since some of the pages fail (and I mustn't skip them), the whole loop takes more than 30 seconds, so the loop runs every 60 seconds to stay in phase.

I noticed Google Earth reloads every page every 30 seconds as configured, regardless of whether some of them have problems (page not available / timeout exceeded). How can I replicate that behaviour?

Thanks in advance

PS: the pages have this format:

<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.1">
  <Placemark>
    <description>
      <![CDATA[ License Plate= (Licence Plate)<br>Chassis = Chassis S/N<br>Model= (model)<br>Bearing= (N/S/E/W)<br>Signal= Signal_Strength<br>Speed= Speed Km/h<br>RPM = (RPM)<br>Date= Dec 8 2010 10:41AM<br>Company= Company_Name ]]>
    </description>
    <name>License Plate</name>
    <Style>
      <IconStyle>
        <Icon>
          <href>stop.png</href>
        </Icon>
      </IconStyle>
    </Style>
    <Point>
      <coordinates>(Coordinate)</coordinates>
    </Point>
  </Placemark>
</kml>
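
The parsing itself isn't the hard part; in rough Python terms, extracting the fields from one page looks something like this (illustrative only, since the actual program is LabVIEW; the field layout is taken from the sample above):

import xml.etree.ElementTree as ET

NS = {"kml": "http://earth.google.com/kml/2.1"}

def parse_placemark(xml_text):
    # Returns the key/value pairs from the CDATA description
    # plus the coordinate string.
    root = ET.fromstring(xml_text)
    pm = root.find("kml:Placemark", NS)
    desc = pm.findtext("kml:description", default="", namespaces=NS)
    fields = {}
    for part in desc.split("<br>"):      # "License Plate= ...<br>Chassis = ..."
        if "=" in part:
            key, value = part.split("=", 1)
            fields[key.strip()] = value.strip()
    coords = pm.findtext("kml:Point/kml:coordinates", default="", namespaces=NS)
    return fields, coords.strip()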


The simple solution is to adjust the parallel For Loop settings like so.

post-7834-0-12578800-1291941324_thumb.pn

The problems with this: you're attempting P network connections at once, which might cause trouble if P is really high; if the number of troublesome links is larger than P, it'll slow down some; and if a loop instance that just finished a slow page gets handed another slow page, it might be too slow.

Since your timeout is smaller than the loop time, P doesn't really need to be larger than the number of troublesome pages.

I think

ceil(numberofpages / floor(looptime/timeouttime))

for the P value (with the generated loops set to a value larger than P will ever be) should work.

You could also set C to

floor(looptime/timeouttime)

for a bit of extra performance, but I don't think you'd notice the difference.
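
For example, with hypothetical numbers (200 pages, a 30 s loop time, a 10 s timeout):

import math

pages, loop_time, timeout = 200, 30.0, 10.0
C = math.floor(loop_time / timeout)   # 3 sequential reads per loop instance
P = math.ceil(pages / C)              # ceil(200 / 3) = 67 parallel instances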

The advanced solution is to dynamically launch multiple copies of a VI that handles a request, and to launch another copy when a request seems to be slow.
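
In text form the idea is roughly this (a Python sketch, since I can't paste G code as text; the handler count and "slow" threshold are made-up numbers): a set of handlers pull URLs from a work queue, and whenever nothing completes for a while, another handler is launched so one stuck request can't hold up the rest.

import threading, queue, urllib.request

def handler(todo, done):
    # One handler plays the role of one dynamically launched VI.
    while True:
        try:
            url = todo.get_nowait()
        except queue.Empty:
            return                              # no work left
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                done.put((url, resp.read()))
        except Exception as err:                # page down or timed out
            done.put((url, err))

def fetch_all(urls, initial_handlers=4, slow_after=2.0):
    todo, done = queue.Queue(), queue.Queue()
    for url in urls:
        todo.put(url)
    def spawn():
        threading.Thread(target=handler, args=(todo, done), daemon=True).start()
    for _ in range(initial_handlers):
        spawn()
    results = {}
    while len(results) < len(urls):
        try:
            url, data = done.get(timeout=slow_after)
            results[url] = data
        except queue.Empty:                     # a request seems slow:
            spawn()                             # launch another handler
    return results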


Thanks for the reply.

I tried to do that in a "strange" way: I split the array into 4, put down 4 For Loops, and wired the size of each sub-array to the parallel instances terminal. At first the program worked as intended, but after some iterations I lost access to the internet. Looking at the router's logs, I noticed this message: "192.168.1.38 exceeds the max. number of session per host!"

So i modified the program again:

twofor.png

This works as intended again, but after some iterations I noticed that the pages take longer to load.

- Am I doing this correctly?

- Is it possible to see the number of open sessions?

- How can I avoid the "exceeds the max. number of session" message?

- Any links to read?

Thx in advance



If you configure the parallelism on the For Loop, you can have up to 64 parallel instances.

I meant this (note it's untested, so could have bugs).

post-7834-0-06582500-1291963672_thumb.pn

The max session limit is a setting on the router; looking up that message on Google turns up ZyWALL routers as having a rather low limit.

Some info on fixing that type of router (assuming that's what you have).

http://www.broadbandreports.com/forum/r9235388-How-many-is-the-max.-number-of-session-per-host-

http://www.broadbandreports.com/forum/remark,9094710~mode=flat?hilite=sessions

http://www.broadbandreports.com/forum/r17252762-ZyWall-2-max.-number-of-session-per-host-

I'm not sure if DataSocket caches its connections; if it does, explicitly opening and closing the DataSocket might help.
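
If you want to rule that out, the plain-HTTP analogue looks like this (a Python sketch, purely illustrative; MAX_SESSIONS is whatever stays under your router's per-host cap): open a fresh connection per read, close it explicitly, and gate the simultaneous opens with a semaphore.

import threading
import http.client
from urllib.parse import urlparse

MAX_SESSIONS = 32                       # stay under the router's per-host limit
gate = threading.BoundedSemaphore(MAX_SESSIONS)

def read_once(url, timeout=10):
    parts = urlparse(url)
    with gate:                          # at most MAX_SESSIONS connections open
        conn = http.client.HTTPConnection(parts.netloc, timeout=timeout)
        try:
            conn.request("GET", parts.path or "/")
            return conn.getresponse().read()
        finally:
            conn.close()                # explicit close frees the NAT session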


Thanks, I'll try this and write back with the results.

Where can I find info about this?

I was mistaken: you can use recursion instead of messing with the VI Server. I made an example in the attached zip (note: poorly tested and poorly documented, since I couldn't spend too much time on it). In the example, URLs of "slow" take 500 ms, "timeout" takes whatever the timeout limit is, and anything else takes 100 ms.

The simplest and fastest way (although it uses the maximum number of connections) is to set C to 1, and P, as well as the generated number of loops, to the maximum number of connections you can have at once.
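
The plain-code analogue of that configuration (again a hedged Python sketch; read_once is the per-page read from the sketch in my previous post, and MAX_CONN is whatever connection cap you settle on):

from concurrent.futures import ThreadPoolExecutor

MAX_CONN = 64                           # P: as many workers as allowed connections

def read_all(urls):
    # C = 1: each worker takes one URL, finishes it, then takes the next.
    # read_once: see the semaphore sketch in my earlier post.
    with ThreadPoolExecutor(max_workers=MAX_CONN) as pool:
        return list(pool.map(read_once, urls))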

Multiread Recur Example.zip


The first one seems easy, but the "max number of sessions" message appeared again, even with the DataSocket open and close (the NAT sessions value on the router is 512, which doesn't seem low).

How can I check that value programmatically?

======================================

I think I've been misstating the problem: it's not about processing the whole array every 30 seconds; instead I have to process every element of the array every 30 seconds, something like this:

singleds.png repeated arraysize.png times

So the program must execute like this:

dsadinfinitum.png (ad infinitum)

But maybe (just maybe xD) a program with 200+ loops isn't the most elegant solution, nor a maintainable one (maybe if the program modified itself and placed While Loops as needed...).
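
What I mean is something like this rough Python sketch (poll_page is just a stand-in for the DataSocket read plus the KML parsing): one small loop per page, all running independently, so one bad page can never delay the others.

import threading, time

def poll_page(url):
    pass                                 # stand-in: DataSocket read + KML parse

def poll_forever(url, period=30.0):
    # Each page keeps its own 30 s schedule, whatever the others do.
    next_due = time.monotonic()
    while True:
        try:
            poll_page(url)
        except Exception:
            pass                         # page unavailable: retry next cycle
        next_due += period
        time.sleep(max(0.0, next_due - time.monotonic()))

urls = []                                # the 200+ page addresses go here
for url in urls:
    threading.Thread(target=poll_forever, args=(url,), daemon=True).start()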

Probably the second proposed solution is the one I want, but I don't fully understand it yet.

Any other suggestions, please?

