I'm also learning/using a 3706A with LabVIEW, and might be able to provide some insight.
There are two main ways to programmatically take measurements (that I'm aware of): calling the DMM directly with dmm.measure(), and using the scan model with scan.execute().
When you create a scan, two properties determine how many readings you'll take: scan.scancount and scan.measurecount.
scan.scancount = how many times to run the scan
scan.measurecount = how many measurements to take per channel per scan
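A minimal TSP sketch of how those two properties fit together (the channel list and counts are placeholders, and I'm assuming DC volts; check the 3706A reference manual for your card's channel numbering):

```lua
-- Sketch: placeholder channel list "1001:1010" = slot 1, channels 1-10.
dmm.func = dmm.DC_VOLTS
dmm.configure.set("myDcv")            -- save the current DMM config under a name
dmm.setconfig("1001:1010", "myDcv")   -- attach that config to the channels

scan.create("1001:1010")              -- build the scan list
scan.scancount = 5                    -- run the whole scan 5 times
scan.measurecount = 2                 -- 2 readings per channel per scan
-- Total readings = 5 scans * 10 channels * 2 per channel = 100
```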
Also, make sure you set the buffer size to an appropriate number with bufferVar = dmm.makebuffer(n), and set the buffer to append with bufferVar.appendmode = 1 (though this may not be necessary for a single scan). dmm.measurecount is how many filtered (in your case) measurements to take when you execute dmm.measure(bufferVar).
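For the buffer sizing, something like this sketch (assuming a 10-channel scan, 5 scans, 2 readings per channel; numbers are placeholders):

```lua
-- Sketch: size the buffer for scancount * channels * measurecount readings.
local buf = dmm.makebuffer(100)   -- e.g. 5 scans * 10 channels * 2 readings
buf.appendmode = 1                -- keep appending across repeated executes
scan.execute(buf)                 -- readings land in buf
print(buf.n)                      -- how many readings the buffer now holds
```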
Setting dmm.filter.count controls how many raw readings are combined into one filtered reading, according to dmm.filter.type. You apply DMM configurations to channels, then run a scan to iterate over the channels in the scan list. You don't have to use a scan, though; you can call dmm.measure(bufferVar) directly.
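Here's a sketch of the filter setup plus the direct (no-scan) path; the config name and counts are just examples:

```lua
-- Sketch: average several raw readings into each returned reading.
dmm.func = dmm.DC_VOLTS
dmm.filter.type = dmm.FILTER_REPEAT_AVG   -- or dmm.FILTER_MOVING_AVG
dmm.filter.count = 10                     -- 10 raw readings -> 1 filtered reading
dmm.filter.enable = dmm.ON
dmm.configure.set("filteredDcv")          -- save this config under a name...
dmm.setconfig("1001:1010", "filteredDcv") -- ...and attach it to channels

-- Or skip the scan entirely and measure directly:
dmm.measurecount = 3                      -- 3 filtered readings per dmm.measure()
local buf = dmm.makebuffer(3)
dmm.measure(buf)
```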
The 3706A also runs Lua, a scripting language, on board. Personally, I use scripts that define scan functions I can pass parameters to (channel list, scan count, etc.), so I can execute scans flexibly and then read the buffer back. In LabVIEW, I use the Script Execute VI to run the scripts.
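A sketch of what such a script might look like; the function name, its parameters, and the saved config name "myDcv" are my own inventions, not a built-in API:

```lua
-- Sketch: a parameterized scan routine you could load as a TSP script
-- and call (e.g. from LabVIEW via the Script Execute VI).
function runScan(chanList, nChans, scanCount, measCount)
  dmm.setconfig(chanList, "myDcv")   -- assumes a config saved earlier as "myDcv"
  scan.create(chanList)
  scan.scancount = scanCount
  scan.measurecount = measCount
  local buf = dmm.makebuffer(nChans * scanCount * measCount)
  scan.execute(buf)
  printbuffer(1, buf.n, buf)         -- send the readings back over the bus
end

-- Example call: runScan("1001:1010", 10, 5, 2)
```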
Let me know if this information makes sense.
edit : Oh, this is from 2015...