Saturday, June 4, 2011

Why aren't firms doing a cost-benefit analysis on data theft?


After reading a lot of recent news about phishing and cracking attacks on high-profile firms, I keep wondering whether anything is being attempted on the security front at all. I mean, you're on the WWW, and there is a broad spectrum of people who are for, against or indifferent to you. If you do something perceived as unpopular, you're inviting some form of protest, legitimate or not. Inevitably, the firm's site gets cracked, a whole lot of really, really sensitive information leaks out, and then there is much grovelling and PR.

Is it that companies are still going through the popular 'security theatre' checklist?

Antivirus software? Check.
Firewall? Check.
RSA tokens? Check.
ACL software? Check.

...and that's it?

And, oh, the 'It can't happen to us/me' syndrome?
Yes, yes, I get the usefulness of the above software and how it raises the bar on cracking and all that, but it all seems so pointless when the actual methods of cracking are revealed, doesn't it?

Why aren't firms doing a cost-benefit analysis on the loss of data before securing it? I mean, if you're looking at a credit card database, wouldn't you plan for the worst case, a complete compromise, and prepare mitigation steps for it? Multistep authorisation, access control, manual verification, disabling remote access for certain operations: aren't these supposed to be standard when securing such data? I find it hard to wrap my head around entire credit card databases being whacked. I can understand a single cardholder's account being compromised through social engineering tricks, but entire card databases? How? It boggles the mind.

Wouldn't one at least estimate the cost of a compromise of the database? That is: we'd lose X million in sales and revenue if this got leaked, along with the bad PR, the legal issues pertaining to card data losses and the cost of notifying individual users, and hence we need the above security checks and processes in place. Shouldn't the expected cost of data loss be a factor when deciding on additional investments of money, time and process to make sure the unthinkable does not happen? Application teams and project managers deploying things would then probably think about security from the ground up rather than treating it as something the infrastructure guys will sort out before deployment.
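
To put rough numbers on it (invented purely for illustration), the standard risk formula is annualised loss expectancy: ALE = SLE x ARO, i.e. the cost of one breach times the expected number of breaches per year. If one card database breach would cost $20 million in lost revenue, legal exposure and user notification, and you expect such a breach once in ten years, then

ALE = $20 million x 0.1 = $2 million per year

Against a $2 million a year exposure, even a fairly expensive programme of multistep authorisation and access controls starts to look cheap.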

And shouldn't they pick up best practices from the casinos? Of course, I realise casinos deal more in physical money than in electronic stuff, but with so much money involved they seem to do a good job of making sure they don't come out red-faced very often. They also seem better at spotting insider threats, with enough checks and balances in place to catch them. I mention this because most data losses apparently stem from insiders doing the deed themselves and/or providing the information to external parties, whether under duress, through carelessness or otherwise.

The downside of this litany of compromises is that there will be legislation and laws that do not make it easy to do business. In India, that seems to have started already, with the central bank insisting that mobile/internet payments in certain cases go through a two-step process. I have to do this now when paying my cellphone bill through the carrier's mobile app: I pay through the app, then I get an SMS outlining how I have to get in touch with the bank's payment gateway, get a one-time code and send that back to the carrier as another SMS.
 

Monday, May 30, 2011

AucTeX Tip: Automatically save file before compiling

It used to irritate me that I had to save the file whenever I hit C-c C-c to compile it. Having to hit 'y' when Emacs queried me about saving before compiling tended to break my flow of work.

No more.

I asked on the AucTeX mailing list, and the answer turned out to be a simple customisation in my .emacs file:

(setq TeX-save-query nil) ;; save without asking before compiling

It also saves me a few C-x C-s keystrokes in the event I forget to save my file regularly. Hopefully it's worth it for you too.

Tuesday, May 24, 2011

A simple way to extract specific PDF pages

Today, I received a humongous PDF with about 300 pages of documentation, which had to be shared with lots of people, each of whom had to review a section independently. Instead of simply forwarding them the entire document and asking them to wade through it themselves, I thought I'd split out the pages and send only the relevant bits.

That should be easy, right?

Well, I forgot what the tool was. A few minutes of Google searching turned up... pdftk, which was what I was looking for. It turned out I had installed it a long time ago, but when I tried it, it dumped core on my Cygwin installation.

This happens to me. A lot. Just when I have a deadline and think I have the solution, the carpet gets pulled out from under me. :-)
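
For the record, when pdftk does work, extracting pages is a one-liner along these lines (the output filename here is just an example; the page numbers match the LaTeX snippet further down):

pdftk RFP.pdf cat 9-14 27 output extracted.pdf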

Wait, I did remember doing something with LaTeX, and another quick search revealed pdfpages on CTAN. I downloaded and installed it, read the documentation, and it was a breeze to get things sorted. The smallest example I could create to extract a specific set of pages is shown below.

\documentclass[a4paper]{scrartcl}
\usepackage{pdfpages}
\begin{document}
\includepdf[pages={9-14,27}]{RFP.pdf}
\end{document}
%%% Local Variables:
%%% mode: latex
%%% TeX-master: t
%%% End:


That's it. LaTeXing the file gave me just the pages I needed. If you have a TeX installation, this works for most cases. Read the documentation if you want to do something fancy, but the above is enough to get the pages you need.
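
As one taste of 'fancy', pdfpages can also n-up pages while extracting them; per the package's nup option, something like this puts four pages per sheet:

\includepdf[pages={9-14,27},nup=2x2]{RFP.pdf}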

Sunday, May 22, 2011

Fitting long TOCs into a single Beamer frame

Here's a simple way to fit a long TOC into a single frame instead of making it flow into multiple frames.  This might be useful for those preparing long lectures with beamer.
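
For instance, beamer's shrink frame option scales the frame contents down so the whole TOC lands on one frame; a minimal sketch (tune the percentage to taste):

\begin{frame}[shrink=10]{Outline}
  \tableofcontents % contents scaled down, by at least 10%, to fit the frame
\end{frame}

Marking the frame [allowframebreaks] instead gives you exactly the multi-frame flow this avoids.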

Monday, May 2, 2011

Automatic Screenshot insertion in Org Mode

Trawling the org-mode mailing list, I found another interesting hack for getting screenshots into your org notes. The thread with the original question is here; the OP wanted to go from the clipboard into a named file. One of the org members posted this reply along with a link to the elisp code.

Works on Unix-like systems with Imagemagick installed.
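
In case the link rots, here's a minimal sketch of the same idea (the function name is mine, not the code from the thread); it shells out to ImageMagick's import, which lets you rubber-band a screen region:

(defun my-org-screenshot ()
  "Grab a screen region with ImageMagick's import and link it in the buffer."
  (interactive)
  (let ((filename (concat (make-temp-name
                           (concat (buffer-file-name) "_")) ".png")))
    (call-process "import" nil nil nil filename) ; interactive region select
    (insert (concat "[[file:" filename "]]"))
    (org-display-inline-images)))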

No Gnus v0.17 Released

Happened on May 1st. The biggest changes are in gnus-registry, if you use it at all, along with minor bug fixes. The snapshot can be downloaded here.

Sunday, May 1, 2011

Integrating Emacs, Org mode and Google Calendar

Well, sort of.

This thread on gnu.emacs.help is interesting in that it allows you to work with your Google calendar from org-mode by daisy-chaining shell scripts and using Googlecl. While this may not be a perfect solution, if you're willing to spend time tinkering with the tools and scripts, it shouldn't be too difficult.
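
As a taste of what Googlecl gives you to chain together (commands from its own examples; the org file path is made up):

# pull today's events into an org file
google calendar today >> ~/org/gcal.org

# push an entry the other way, using quick-add syntax
google calendar add "Review org notes tomorrow at 10am"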

Monday, April 25, 2011

Gnus Topics

For those of you who are new to Gnus, it might be worth checking out Gnus Topics. It's as simple as hitting t in the Group buffer, which toggles topic mode. With the Topic commands available from the menu bar, it's easy to organise your groups into different categories. You can move and copy groups and arrange them any way you want. The learning overhead is small enough that there's no reason not to check it out.
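
And if topics stick, the Gnus manual suggests turning topic mode on permanently with a one-liner in your ~/.gnus:

(add-hook 'gnus-group-mode-hook 'gnus-topic-mode)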

Friday, March 11, 2011

Emacs 23.3 released

Emacs 23.3 is now officially released and available at the usual places. The NEWS file has the changes for what is essentially a bug-fix release.

Thursday, March 3, 2011

Emacs 23.3 Release Candidate available

Unless there are bugs to be fixed, the release candidate will become the final Emacs 23.3.  Announcement here and Windows binaries here.