Any software is in itself (on a very basic level) already a decision maker: it has its set of parameters, within the limits of which it can function and, in that sense, make "decisions".
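To illustrate what I mean by "decisions" within fixed parameters, here's a toy sketch of my own (the thresholds and names are made up, not from any real system): the program picks an action, but only from options and limits baked into its code.

```python
# Toy illustration: a thermostat-style program "deciding" only within
# the parameters it was given -- hard-coded thresholds it cannot change.
HEAT_BELOW = 18.0   # hypothetical fixed parameter
COOL_ABOVE = 25.0   # hypothetical fixed parameter

def decide(temperature: float) -> str:
    """Pick an action, but only from the options baked into the code."""
    if temperature < HEAT_BELOW:
        return "heat"
    if temperature > COOL_ABOVE:
        return "cool"
    return "idle"

print(decide(15.0))  # heat
print(decide(30.0))  # cool
```

It "decides", but it can never invent a fourth action or move its own thresholds.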
If you want a full, self-aware AI, then you would have to go further than that. It would have to be able to learn and make decisions outside of its established parameters (call it thinking outside of the box, lol, quite literally), and to a degree rewrite its own parameters, like we humans do; rewriting its own base code. That might seem quite straightforward, but as far as programming is concerned, it would be something quite complicated.
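A very crude sketch of that "rewriting its own parameters" idea (again, a made-up example of mine, nowhere near real learning or self-awareness): a program that nudges one of its own parameters when its decisions turn out wrong.

```python
# Toy sketch: a program that adjusts one of its own parameters based
# on feedback -- a crude stand-in for "rewriting its own parameters".
class Adaptive:
    def __init__(self):
        self.threshold = 0.5  # starts out as a fixed parameter

    def decide(self, value: float) -> bool:
        return value > self.threshold

    def feedback(self, value: float, was_correct: bool):
        # When a decision was wrong, the program changes itself:
        # it moves its own threshold away from the mistake.
        if not was_correct:
            step = 0.1
            self.threshold += step if self.decide(value) else -step

a = Adaptive()
a.feedback(0.6, was_correct=False)  # it said True, so nudge the threshold up
print(a.threshold)
```

Even this stays inside a frame a human wrote; it tweaks a number, it doesn't rewrite its own base code, which is the part that gets genuinely hard.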
Certain programs can mimic human decision-making processes, but in terms of real self-awareness, we still have a long way to go. Again, this is my opinion only.