I think Greg McKaskle's explanation in the "Clear as Mud" thread on the dark-side covers the hint about not wiring the input, in the section where he talks about the default value having to be supplied when the input is not wired.
I generally look at the code for obvious errors that could occur through misuse by others (or myself), ask what the follow-on effect would be if the code did not work, and consider how difficult it would be to diagnose the failure from the error cluster info alone. For code that touches a lot of stuff for the first time, I will use nested error clusters so that I can clearly distinguish a file I/O error (say, from a bad config file) from a DAQ error caused by the hardware being shut off.
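For anyone not living in LabVIEW, here is a rough text-language analogy (Python) of that nesting idea: wrap the low-level failure in a more specific error so the top level can tell a config-file problem from a hardware problem, without losing the original cause. The names here are hypothetical, just a sketch of the pattern.

```python
class ConfigFileError(Exception):
    """File I/O problem while reading the DAQ configuration."""

class DaqHardwareError(Exception):
    """The DAQ itself rejected the setup (e.g. powered off)."""

def load_config(path):
    try:
        with open(path) as f:
            return f.read()
    except OSError as e:
        # Preserve the inner error, like nesting one error cluster
        # inside another: the cause survives for diagnosis.
        raise ConfigFileError(f"could not read config {path!r}") from e

def start_daq(config):
    # Stand-in for talking to real hardware.
    if not config.strip():
        raise DaqHardwareError("DAQ rejected empty configuration")
    return "running"

def bring_up(path):
    config = load_config(path)   # fails as ConfigFileError
    return start_daq(config)     # fails as DaqHardwareError
```

At the call site you can catch the two error types separately, so a missing file never gets misdiagnosed as dead hardware.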
In the early days of UNIX there was no error recovery or logging built into the OS. The philosophy was "well, fix the hardware, then restart the OS." That left a bad mark on me, so now I "drop bread crumbs" in my code so that I can nail issues if they come up.
But not all of my code is wrapped in error clusters: number crunching, bit banging, etc.
I appreciate the report that performance is about the same whether the error cluster is wired through or not.
Even if there were a performance hit that could be measured, I'd still use error clusters for all but the most demanding performance situations. Anybody can drive a car 100 MPH, but to do it safely is another story.
I once posted here that the "extra inputs" on the icon connector actually incur a performance hit that can be measured under the right conditions. Even after learning that fact, I still include extra connectors on the icon to make my life easier, even if there may never be a need for them.
Take care,
Ben