Activity recognition plays a key role in providing information to context-aware applications. When modeling activities, some researchers have looked to Activity Theory, which holds that activities are directed at objectives and are accomplished through tools and objects. The goal of this paper is to determine whether hand posture can serve as a cue to the types of interactions a user has with objects in a desk/office environment. Furthermore, we wish to determine whether hand posture is user-independent, that is, consistent across users interacting with the same objects in a natural manner. Our initial experiments indicate that (a) hand posture can be used to determine object interaction, with accuracy rates above 94% for a user-dependent system, and (b) hand posture varies with the individual user when users are allowed to interact with objects as they would naturally.