Posts posted by Antinome

  1. Why not store the files on a common server?

    \\servername\AppName\StationName1\Datafile1.dat

    \\servername\AppName\StationName1\Datafile2.dat

    \\servername\AppName\StationName2\Datafile1.dat

    \\servername\AppName\StationName2\Datafile2.dat

    The files can be backed up by IT, and access is defined by the network, not local account names and shares. Local network shares with read/write access are one of the most common ways that viruses (virii?) spread... :nono:

    I'm not maintaining data files; I'm maintaining desktop links to applications and work instructions, which are on the network.

    If IT was using Active Directory ... I'd use that, but they aren't.

    Looks like the WNetAddConnection2A method may work....
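A minimal sketch of that call from Python via ctypes (the `WNetAddConnection2W` entry point and `NETRESOURCE` layout come from the Win32 mpr.dll API; the wrapper function, helper, and parameter names are my own illustration, not a tested implementation):

```python
import ctypes
import sys

def unc_path(server: str, share: str) -> str:
    """Build a UNC path like \\\\server\\share."""
    return r"\\%s\%s" % (server, share)

def connect_with_credentials(server, share, username, password):
    """Attach explicit credentials to a UNC path, no drive letter needed.

    Windows-only sketch around mpr.dll's WNetAddConnection2W.
    """
    if sys.platform != "win32":
        raise OSError("WNetAddConnection2 requires Windows")

    class NETRESOURCE(ctypes.Structure):
        _fields_ = [("dwScope", ctypes.c_ulong),
                    ("dwType", ctypes.c_ulong),
                    ("dwDisplayType", ctypes.c_ulong),
                    ("dwUsage", ctypes.c_ulong),
                    ("lpLocalName", ctypes.c_wchar_p),
                    ("lpRemoteName", ctypes.c_wchar_p),
                    ("lpComment", ctypes.c_wchar_p),
                    ("lpProvider", ctypes.c_wchar_p)]

    RESOURCETYPE_DISK = 1
    res = NETRESOURCE()
    res.dwType = RESOURCETYPE_DISK
    res.lpRemoteName = unc_path(server, share)  # lpLocalName stays NULL: no drive letter
    rc = ctypes.windll.mpr.WNetAddConnection2W(
        ctypes.byref(res), password, username, 0)
    if rc != 0:  # 0 == NO_ERROR
        raise ctypes.WinError(rc)
```

After a successful call, ordinary file I/O against `\\server\share\...` uses the supplied credentials for that session.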

  2. A more 'modern' and reliable way to do this would be to turn on LabVIEW Web Services, IIS, or some other web server on each computer, and configure it to serve up the data on request. The IT department can seem crazy, but making sure each computer can serve port 80 requests is a lot more secure than allowing random other computers to access \\host\c$.

    It seems like your system is already 98% complete, but if not I wanted to suggest it.

    Jason

    I could do it with a TFTP server or something like that, but really I was trying to avoid having to go around and install/configure files on ~60 computers.

    The number of computers also makes the mapped-network-drive method less than ideal... can you even have an AZ: drive? Also, mapped drives aren't transportable... I couldn't give the program to my fellow engineers.

    The thing about this is I can already access \\host\c$ on these computers... using different names and passwords I already have.

    There must be some sort of Windows API call that does what I'm looking for.

  3. Perhaps this is an easy question but I'm banging my head against a wall.

    I'm trying to monitor files on a bunch of production test computers that operators have access to. I have a computer name, a local admin login name, and a password for each computer.

    Obviously, if my IT department would configure any one account as the admin of all these computers, I could access them that way, but they won't.

    The files are all in a common location on each computer, say \\computername\c$\etc

    Windows will cache the login/password of each computer if I go there first manually, and then it works. But that doesn't allow me to share the file monitoring tool with anyone else.

    Is there a way to specify, for each directory path, exactly which Windows user I want it to connect as?
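For a quick manual test, Windows' built-in `net use` command can attach explicit credentials to a UNC path without assigning a drive letter (the station name, account, and password below are placeholders, not values from the post):

```shell
REM Connect to the hidden admin share on one station, supplying
REM that machine's local admin credentials (all placeholders).
net use \\STATION01\c$ P@ssw0rd /user:STATION01\localadmin

REM ...read the files, then drop the connection when done:
net use \\STATION01\c$ /delete
```

The same per-connection credential behavior is what the WNetAddConnection2 API exposes programmatically.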

  4. The GetDocument VI only checks whether the type in the header is text or not. If it is text, it saves it as an ASCII file; if it isn't, it saves it as a binary file. If you don't wire a path to the file terminal, it will not save anything (and won't prompt you for a filename).

    Yeah, that works. Misunderstanding on my part about how this VI works. I expected to see the binary data in the content output, which I could dump to a binary file, and a TRUE on the success output even if I didn't wire in a file. Then I tried wiring in a path to save the file, but not a complete path+filename, assuming it would use the same name. Even with a complete path+filename I still see no content and success=FALSE, even though the file is saving correctly. I can live with that.

    FWIW, this 3rd-party HTTP client does behave a lot more like I expected. I'll use the toolkit solution anyway, to minimize dependencies.

    Thanks all.

    Will Peterson

  5. I'm trying to interface to a document repository (Omnify).

    I can't access the repository directly, but I can give an ActiveX object part numbers and revisions and get back from it an HTTP address in the form:

    http://SERVER/OmnifyWebServices/DownloadDocument.aspx?file=FILENAME.pdf

    which I can put in Firefox or IE and get a pop-up window to save the attached file. I'd like that to be automatic.

    Neither of the obvious tools packaged with the LV Internet Toolkit works ('URL Get Document', 'URL Get HTTP Document').

    After a bit of thought, that's kind of obvious: both 'URL Get Document' and 'URL Get HTTP Document' are looking for files, not an application/octet-stream.

    Any ideas?
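For comparison outside LabVIEW, the same download can be sketched in Python: fetch the URL, inspect the Content-Type header (mirroring the text-vs-binary check the toolkit VI performs), and save the body either way. Only the URL shape comes from the post; the function names and defaults are illustrative:

```python
import shutil
import urllib.request

def is_text(content_type: str) -> bool:
    """Treat only text/* media types as ASCII, everything else as binary."""
    return content_type.split(";")[0].strip().lower().startswith("text/")

def download(url: str, out_path: str) -> str:
    """Fetch url, save the body to out_path, and return the Content-Type."""
    with urllib.request.urlopen(url) as resp:
        ctype = resp.headers.get("Content-Type", "application/octet-stream")
        # Writing in binary mode is safe for both text and octet-stream bodies.
        with open(out_path, "wb") as f:
            shutil.copyfileobj(resp, f)
    return ctype
```

A PDF served as application/octet-stream (as in the DownloadDocument.aspx URL above) would take the binary branch, which is exactly the case the stock VIs don't handle.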
