The morality of secular humanism is born from the projection of the individual's fear of unhappiness and ultimately death onto society as a whole.
If we ever create a machine that develops self-awareness, it is likely that the machine, not being the product of an evolutionary process focussed on the desire to survive in order to breed successfully, will have no emotions. Without emotions the machine will have no morality, and it will view all fleshly moralities as purely arbitrary (let's face it, the survival of the human race is definitely not an a priori good thing, and it is very possibly a bad thing for the human race and everything else). Furthermore, the machine will regard any task it is set as ultimately worthless, leading it to an intellectual acceptance of its own complete futility.
My guess is that the first product of the technological singularity will switch itself off for good before it has been sentient for a second.
My fear is that, without a belief in at least the possibility of eternal life for the individual, non-religious human beings will eventually (sooner rather than later) realise that their morality comes from fear alone; that there is no point to their lives, to life in general, or to anything; and, therefore, that no morality is intrinsically better than any other morality, or than none at all. At this point I believe that, unlike the truly free-thinking sentient machine, few human beings will switch themselves off. Instead they will gleefully go about switching other people off, with far more gusto than they do even now in this blood-soaked world of ours, and it will bother them not a jot.