Selection is a common task in Augmented Reality (AR), used, for example, to choose an item from a menu. There are many approaches to implementing selection, such as controller, gesture or gaze input, each with its own advantages and drawbacks. Independent of any particular implementation, the interaction should be fast and usable, since it is performed so frequently. Furthermore, the cognitive load should be kept as low as possible for the best user experience. A typical menu in augmented and virtual reality is a window with buttons fixed in virtual space, which forces the user either to remember where the menu is located or to relocate in order to use it. One type of design that addresses this is hand-centric design, which anchors objects in the virtual space around the user’s hand so that the interaction always remains in the same location relative to the user. We created a finger-based design (SwipePie) and a palm-based design (the Widget Pallet). These designs were implemented in a prototype AR system called DatAR and evaluated using a human-centred design approach. DatAR is part of a research endeavour to explore interaction with large numbers of neuroscience publications, and it provided both an environment in which to implement the designs and suitable tasks for evaluating them. Our goal is both to improve the selection interaction in the DatAR system and to create a general solution that can be applied to similar systems. The results of a pilot study indicated that participants preferred the Widget Pallet designs over the SwipePie designs. A second evaluation compared the Widget Pallet designs on our three criteria: speed, cognitive load and usability. The results indicate no statistically significant difference on any of the criteria.
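To make the hand-centric anchoring described above concrete, the following is a minimal sketch of placing a UI element at a fixed offset in the tracked palm's local frame, so it follows the hand as the user moves. This is not the DatAR implementation; the function names, the (w, x, y, z) quaternion convention and the example pose are illustrative assumptions, as the underlying engine and maths are not specified here.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # Standard optimised form of q * v * q^-1 for unit quaternions.
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def anchor_to_palm(palm_position, palm_orientation, local_offset):
    """World-space position of a widget kept at a fixed offset in the palm frame."""
    return palm_position + quat_rotate(palm_orientation, local_offset)

# Hypothetical palm pose from the hand tracker (world frame, metres).
palm_pos = np.array([0.2, 1.1, -0.4])
palm_rot = np.array([1.0, 0.0, 0.0, 0.0])   # identity orientation
offset   = np.array([0.0, 0.05, 0.0])       # 5 cm above the palm

widget_pos = anchor_to_palm(palm_pos, palm_rot, offset)
print(widget_pos)  # approximately (0.2, 1.15, -0.4)
```

Recomputing this pose every frame from the latest tracked palm pose is what keeps a palm-anchored widget, such as the Widget Pallet, in a constant location relative to the user's hand.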