The Developer's Guide has some helpful information in this regard - https://ar.qualcomm.at/qdevnet/developer_guide/ .
Take a look at the sections on Trackables and AR App Design.
The critical factors for user experience are going to be lighting, distance, occlusion, and perspective. As you've noticed, the tracking will tolerate a broad range of lighting conditions, but it works best with illumination appropriate for reading (e.g. you'll see quicker target detection in that range). Glare and reflection tend to blow out and/or confuse the regions of the image they affect, which will impact detectability and tracking. This typically isn't a significant issue, though, unless you're using glossy print media with bright, direct lighting on the target.
Some basic troubleshooting tips for users are:
1. Make sure that you can see all of the details of the image clearly on screen, and that the entire image is in view.
2. Try viewing the image at a slight angle.
These will ensure that the target is detected. If the details of the image are clear to the human eye, they will be apparent to the Tracker, and if the entire image is in view on screen, then all of the image's features will be available to the Tracker. One problem I've often seen new users encounter, if they're unfamiliar with AR, is placing the camera too close to the target. This reduces the area of the image captured by the camera, and with it the number of features the Tracker can use.
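To put a rough number on the too-close problem, here's a quick back-of-the-envelope sketch (not Vuforia code; just a pinhole-camera approximation, with the 60° horizontal field of view being my assumption) of how much of a target's width actually fits in frame at a given distance:

```python
import math

def visible_target_fraction(distance_m, target_width_m, hfov_deg=60.0):
    """Fraction of the target's width that fits in the camera frame
    at the given distance, assuming a pinhole camera held parallel
    to the target with the given horizontal field of view."""
    # Width of the scene the camera can see at this distance.
    view_width = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return min(1.0, view_width / target_width_m)

# A 30 cm-wide target viewed from 10 cm away: well under half the
# target is in frame, so most of its features are unavailable.
print(visible_target_fraction(0.10, 0.30))

# From about 30 cm, the whole target fits in view.
print(visible_target_fraction(0.30, 0.30))
```

The exact numbers depend on the device's actual field of view, but the shape of the relationship is the point: halving the distance halves the visible width, and any part of the image outside the frame contributes nothing to detection.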
Tip 2 can be helpful with marginal targets because tracking actually benefits from some perspective distortion. Also, novices will sometimes assume that the camera has to be parallel to the plane of the target, or held directly over the target's center. This hurts their user experience because they're not moving the camera freely, so telling them to view the target at an angle helps dispel that assumption.
Feel free to PM me your doc if you'd like some feedback on it.