In their review article "Trends in Steganography" (Mar. 2014), Elżbieta Zielińska et al. included a good survey of the history of data hiding and a comprehensive list of methods for inserting bits into cover objects but omitted an important actor from the scene—the enemy. In computer security, a system is secure only if it prevents the enemy from achieving certain specified goals. If the primary aim of steganography is to communicate covertly (as the article correctly said), then the enemy is someone—a "steganalyst"—who monitors communications in order to detect covert ones. Such a scenario reflects reality in light of today's pervasive monitoring of the Internet by intelligence agencies and criminals.
Researchers would do well to identify new media that support covert communication, but only if they also show the result is not detectable by an enemy using the tools (such as statistical analysis and machine-learning algorithms) needed to unmask hidden data. I would not want a Communications reader to imagine steganographers forget they indeed have enemies; far from it. But few steganography methods survive long once researchers start trying to detect them through statistical methods. The only exceptions are ultra-low-bandwidth mechanisms that find perfectly random parts of the cover (where hiding is trivial) and highly refined methods that exploit coding theory and distortion minimization, applied (at least in recent years) to image steganography.
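To illustrate how quickly naive embedding falls to such scrutiny, here is a minimal sketch of my own (not a method from the article or the letter): the classic chi-square "pairs of values" test of Westfeld and Pfitzmann, which flags full-rate LSB replacement because that kind of embedding tends to equalize adjacent histogram bins. The toy images are invented purely to show the mechanics.

```python
# Sketch of the chi-square "pairs of values" steganalysis test (Westfeld-Pfitzmann);
# an illustrative classic statistical test, not a method taken from the article.
import numpy as np
from scipy.stats import chi2

def chi_square_lsb_test(pixels: np.ndarray) -> float:
    """Return a p-value; values near 1.0 suggest full-rate LSB replacement."""
    hist = np.bincount(pixels.ravel().astype(np.int64), minlength=256).astype(float)
    even, odd = hist[0::2], hist[1::2]
    expected = (even + odd) / 2.0          # what each pair looks like after embedding
    mask = expected > 0                    # ignore empty bin pairs
    stat = np.sum((even[mask] - expected[mask]) ** 2 / expected[mask])
    dof = int(mask.sum()) - 1
    return 1.0 - chi2.cdf(stat, dof)       # near 1.0 => bins 2k, 2k+1 already equalized

# Toy demonstration: a cover with a deliberately uneven histogram (even values only),
# then the same image with its LSB plane randomized to simulate embedding.
rng = np.random.default_rng(0)
cover = (rng.integers(0, 128, (256, 256)) * 2).astype(np.uint8)
stego = ((cover // 2) * 2 + rng.integers(0, 2, cover.shape)).astype(np.uint8)
print(chi_square_lsb_test(cover))   # close to 0: no embedding signature
print(chi_square_lsb_test(stego))   # close to 1: embedding suspected
```

In practice one would run such a test over real cover images; the point is only that a few lines of statistics already suffice to expose the simplest hiding schemes.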
Trends in steganography research have advanced past finding new bits to twiddle in communication streams, focusing instead on the enemy and asking important theoretical questions: Can a steganographer use information theory to provide an upper bound on an adversary's powers? (Yes.) Must the number of steganographic changes be proportional to the size of the hidden message? (Surprisingly, no.) Can a researcher say something about secure capacity relative to the size of the cover? (Yes.) And can a steganographer identify parts of the cover medium that are better to hide in? (Yes, but at the moment only heuristically.) The answers lead to further fascinating game-theoretic questions: If a steganographer embeds secret information only in the "best" places, the enemy would likely look only there and catch the steganographer out, so steganographer and steganalyst alike should randomize their behavior. These questions also represent today's most interesting research challenges.
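To make the randomization point concrete, here is a toy sketch of my own (not from the letter): a 2x2 zero-sum game between a steganographer choosing where to embed and a steganalyst choosing where to concentrate analysis, with invented detection probabilities. Solving for the mixed-strategy equilibrium shows both players should randomize, and the steganographer fares better than by always using the single "best" location.

```python
# Toy game-theoretic illustration: if the steganographer always hides in the "best"
# place, the steganalyst looks only there, so both should randomize. All payoff
# numbers below are invented detection probabilities for a hypothetical scenario.

def solve_2x2_zero_sum(a, b, c, d):
    """Mixed-strategy equilibrium of [[a, b], [c, d]] (row minimizes, column maximizes).
    Assumes the game has no saddle point, so both players genuinely mix."""
    denom = a - b - c + d
    p = (d - c) / denom            # prob. row player picks row 1
    q = (d - b) / denom            # prob. column player picks column 1
    value = (a * d - b * c) / denom
    return p, q, value

# Rows: where the steganographer embeds (textured region, smooth region).
# Columns: where the steganalyst concentrates analysis (textured, smooth).
# Entries: detection probability (invented for illustration).
p_hide, q_watch, detection = solve_2x2_zero_sum(0.5, 0.1, 0.2, 0.8)
print(f"steganographer embeds in textured region with prob. {p_hide:.2f}")   # 0.60
print(f"steganalyst inspects textured region with prob. {q_watch:.2f}")      # 0.70
print(f"equilibrium detection probability {detection:.2f}")  # 0.38, below the 0.50
# worst case of always embedding in the textured region
```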
Andrew D. Ker, Oxford, U.K.
Institutionalize Software Ethics
I support and applaud Arvind Narayanan's and Shannon Vallor's Viewpoint "Why Software Engineering Courses Should Include Ethics Coverage" (Mar. 2014), but even if all software engineering educators agreed with them, the ethical challenges confronting software engineers who have already completed their formal education would still go unaddressed. How can ethical reasoning and behavior be encouraged if only some software engineers have had formal education in ethics? One solution is to embrace basic professionalization. However, a lack of professional status and ethics education does not absolve any IT employee, not only software engineers, of ethical responsibility, given IT's integral role in society, as the authors outlined in their column.
I previously wrote1 that organizations wishing to build strong ethical cultures would do well to employ IT ethics officers as part of their organizationwide ethics programs. Such an officer should have knowledge of IT, business, and ethics and take on several important responsibilities:
Train employees. Develop IT-specific training programs for IT and non-IT employees alike;
Evaluate proposals. Assess IT project proposals for ethical risk, recommending mitigation measures;
Develop policies. Help develop ethics policies and integrate ethical standards into organizational procedures pertaining to IT; and
Be an ethics coach. Function as a confidential resource for IT employees facing ethical dilemmas.
This approach would help organizations and their employees bear up under government and public scrutiny and pressure to conduct business ethically vis-à-vis IT-enabled systems, from information security to personal privacy to health care. Though critics might view such a position as just another layer of red tape, creating an ethical culture should no longer be the concern of only a few IT professionals and organizations.
Shana Ponelis, Milwaukee, WI