
PToN
Premium Member
join:2001-10-04
Houston, TX

PToN to JoelC707

Re: Push Application and other files once committed.

- The customer sends us a test program to run on a certain device.
- The test program's files are placed in a "tests" directory within the "app" folder on the network share.
- When a client PC runs a test, the tester double-clicks a shortcut to the app's exe, which lives on the network share. All of the app files (excluding the tests) are loaded into memory.
- The tester goes to File -> Open and selects the test to run.
- When the test starts, every single output/result is logged to several files on the same network share.

What I am trying to accomplish is to eliminate the network share as a single point of failure. I was thinking that something like Git or SVN would help me do this: have the clients update from the central "repository" before a test, then push their changes/additions back to the origin when the test is done.

This way, they all have a local copy and are able to run the tests even when the network goes down for whatever reason.
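If it helps to picture it, here is a rough sketch of the kind of wrapper script the client shortcut could point at instead of the exe on the share. It is only a sketch under assumptions I made up (a local clone at C:\testapp, an app.exe at the repo root, a single master branch on a remote called origin): pull before launching, fall back to the local copy if the server is unreachable, and commit/push the result files afterward.

import subprocess
import sys
from pathlib import Path

# Hypothetical layout: each client keeps a local clone of the shared "app" repo here.
WORKING_COPY = Path(r"C:\testapp")
APP_EXE = WORKING_COPY / "app.exe"   # assumed name of the test application
REMOTE = "origin"                    # central repo on the server
BRANCH = "master"

def run_git(*args):
    # Run a git command inside the working copy; raise if git reports an error.
    return subprocess.run(["git", *args], cwd=WORKING_COPY, check=True)

def main():
    # Sync from the central repo first; if the server is down, keep working
    # from whatever the client already has locally.
    try:
        run_git("pull", "--ff-only", REMOTE, BRANCH)
    except (subprocess.CalledProcessError, OSError):
        print("Warning: could not reach the central repo, using the local copy.")

    # Launch the app from the local working copy instead of the network share.
    subprocess.run([str(APP_EXE)], check=True)

    # After the test, commit any new/changed result files and try to push them back.
    run_git("add", "-A")
    commit = subprocess.run(["git", "commit", "-m", "Test results"], cwd=WORKING_COPY)
    if commit.returncode == 0:  # git commit exits non-zero when there is nothing to commit
        try:
            run_git("push", REMOTE, BRANCH)
        except (subprocess.CalledProcessError, OSError):
            print("Warning: push failed; results are committed locally and can be pushed later.")

if __name__ == "__main__":
    main()

If the push fails because the server is down, the results are still committed locally, so they simply get pushed the next time a test runs and the server is reachable.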

I may be complicating things, but I need to present several options for eliminating downtime on these machines.

Thanks.

guppy_fish
Premium Member
join:2003-12-09
Palm Harbor, FL

guppy_fish

I would focus on why the network can't be relied on ... the traffic it uses is trivial.
JoelC707
Premium Member
join:2002-07-09
Lanett, AL

JoelC707 to PToN

No, I don't think that's complicating things at all. I think you're on the right track with this. It's really just like application development, and the Git/SVN/etc. approach is how that is usually handled; it works just fine.

I am curious though: have you had network issues in the past? Any to the point that this has become a serious issue? Or are you just trying to be proactive and solve it before it becomes one?

PToN
Premium Member
join:2001-10-04
Houston, TX

PToN

Just trying to cover all the what-ifs.

Plus, we recently had a scenario where an admin accidentally moved several files and they were thought to be deleted. The backups were no good: when they were restored, all of the restored data was garbage (don't ask why, because I don't know why).

So I am thinking of ways to cover that kind of horror: the data is lost, the backups are bad, and on top of that the server crashes (let me change "network" to "server", since the network itself is pretty redundant; I should have said it better at first).

Having the clients keep a local copy of what's needed could have allowed work to continue even if everything on the server was lost.
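That is also where something like Git would have helped with the lost-files scenario: every client clone carries the full history, not just the latest files, so a new central repository could be re-seeded from any surviving client. A rough sketch of that recovery, assuming a fresh bare repo has already been created on a replacement server (git init --bare) at a path I made up:

import subprocess

# Hypothetical paths: a surviving client clone and the fresh bare repo on the new server.
CLIENT_CLONE = r"C:\testapp"
NEW_CENTRAL = r"\\newserver\repos\app.git"

# Point the client at the rebuilt central repo and push the full history back up.
subprocess.run(["git", "remote", "set-url", "origin", NEW_CENTRAL], cwd=CLIENT_CLONE, check=True)
subprocess.run(["git", "push", "--all", "origin"], cwd=CLIENT_CLONE, check=True)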

It has already been noted that we need to correct why the backups failed, etc. I'm just looking for other ways to keep things going.

Thanks.