The many-rows insert has one serious side effect: it can lock the table for a long time, which is unacceptable in multi-user environments. That is not a concern in the example scenario, however, where you insert just a few hundred rows in one shot; a batch that small cannot cause a locking problem.
The 8,000-byte limit on the length of a varchar variable is another inconvenience. You can work around it by storing incoming strings in a separate table and running another process that checks for completed sets of strings belonging to the same survey and user submission. You can then insert each completed set into the working table asynchronously.
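The staging approach above could be sketched as follows. This is a minimal illustration, not the article's actual schema; all table, column, and procedure names here are assumptions.

```sql
-- Hypothetical staging table for incoming string chunks.
-- Each submission is split into numbered chunks of at most 8,000 bytes.
CREATE TABLE dbo.IncomingStrings (
    SurveyID     int           NOT NULL,
    SubmissionID int           NOT NULL,
    ChunkNo      int           NOT NULL,
    TotalChunks  int           NOT NULL,  -- expected number of chunks in the set
    StringData   varchar(8000) NOT NULL,
    CONSTRAINT PK_IncomingStrings
        PRIMARY KEY (SurveyID, SubmissionID, ChunkNo)
);
GO

-- Asynchronous check: a set is complete when every expected chunk has arrived.
-- A scheduled job could run this query and move the completed sets
-- into the working table.
SELECT SurveyID, SubmissionID
FROM dbo.IncomingStrings
GROUP BY SurveyID, SubmissionID, TotalChunks
HAVING COUNT(*) = TotalChunks;
```

Because the completeness check runs outside the user's request path, a slow or partially arrived submission never blocks the client; it simply waits in the staging table until its last chunk shows up.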
In SQL Server 2005, where the varchar(max) data type can store up to 2 GB, you have much more flexibility: you can adjust the length of the string to any size up to 2 GB and tune it for optimal string-insert performance.
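A quick sketch of the difference. With a plain varchar(8000) variable the string is silently capped at 8,000 bytes, while varchar(max) keeps growing; note the CAST to varchar(max) inside REPLICATE, without which the result would itself be truncated to 8,000 bytes.

```sql
-- varchar(max) removes the 8,000-byte ceiling (SQL Server 2005 and later).
DECLARE @Strings varchar(max);

-- Build a 100,000-character string; the CAST forces REPLICATE
-- to return varchar(max) instead of truncating at 8,000 bytes.
SET @Strings = REPLICATE(CAST('x' AS varchar(max)), 100000);

SELECT LEN(@Strings) AS StringLength;  -- well past the old 8,000-byte limit
```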
One last note: validate data in the body of your stored procedures. Although validation makes the procedures heavier, your string inserts will still be much faster than repetitive one-row inserts.
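Validation in the procedure body might look like the following. This is an illustrative sketch only; the procedure name, parameter, and the assumed digits-and-commas format are not from the article.

```sql
-- Hypothetical insert procedure that validates its input string first.
CREATE PROCEDURE dbo.InsertSurveyAnswers
    @Answers varchar(8000)
AS
BEGIN
    SET NOCOUNT ON;

    -- Reject missing input before attempting the multi-row insert.
    IF @Answers IS NULL OR LEN(@Answers) = 0
    BEGIN
        RAISERROR('No answer data supplied.', 16, 1);
        RETURN;
    END;

    -- Assumed format: digits separated by commas. Reject anything else.
    IF @Answers LIKE '%[^0-9,]%'
    BEGIN
        RAISERROR('Answer string contains invalid characters.', 16, 1);
        RETURN;
    END;

    -- ... parse @Answers and perform the multi-row insert here ...
END
```

Even with these extra checks executed once per call, the cost is paid a single time per batch rather than once per row, which is why the string insert keeps its performance advantage over repetitive one-row inserts.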