OSINT & Zombie Journals—Part 5

To improve your evaluation skills, develop three abilities:

  1. Maintain a skeptical mind-set.
  2. Learn which sources are most trustworthy.
  3. Learn to identify reporting errors and inconsistencies. If something does not look right, investigate its veracity.

In addition, the skeptical investigator must develop a structured means of evaluating the relevance and reliability of the collected data along with the ability to communicate this. In my opinion, this is the most demanding part of any investigation.

The following list of 13 evaluation criteria has developed over decades of practice by researchers and investigators worldwide. This list of evaluation criteria appears in one form or another throughout the literature on research and intelligence analysis. I use a Microsoft Excel spread sheet based upon them to record the evaluation of sources for most investigations. Doing this is time-consuming, but it is often necessary to maintain the integrity of the reported data and the conclusions that are drawn from it.

The Internet is renowned for harbouring unreliable information. The following evaluation matrix will work for you if you rigorously apply it.

Evaluation Matrix

  1. Recency. Do the data appear to be current on the subject, or the most appropriate for the historical time period? Are sources dated and maintained?
  2. Relevancy. Is there a direct correlation to the subject? What is the tone of the information display? Is it popular, scholarly, or technical?
  3. Authority. What is the reputation of the data, and the data-provider? Has this source of data been cited elsewhere? What is the reliability of the source? How can you document or qualify the source of the information?
  4. Completeness. Are alternative data or views cited? Are references provided? Given what you know about the source, is there any evidence that the data is NOT ‘slanted’?
  5. Accuracy. Does the source furnish background information about data sources and/or in-­depth data? Are the complex issues of data integrity and validity oversimplified? Are the terms adequately defined? Are there references or sources of quotes?
  6. Clarity. Is the presentation of information really credible? Can bias of the information providers really be ruled out? Are there any logical fallacies in presentation of the data or assertions, or in other statements or publications on the page or by the source? Are key assumptions described, or are they hidden in the hope that the reader will be gullible?
  7. Verifiability. Can the information be verified? Can you find corroboration? If not, why not?
  8. Statistical validity. Can the key points or critical data be supported by standard statistical testing? With subjective information, one verifies by corroboration as it can be found. With numerical information, many questions arise. What statistical inference is needed in order to accept any implied inferences of the data displayed? Are there clear explanations for readers or viewers to qualify the implications of numerical “averages” or “percentages?”
  9. Internal consistency. Do the data or commentary contain internal contradictions? Know what you can about the source, and scan for logical fallacies throughout the presentation.
  10. External consistency. Do the data reflect any contradictions among the source documents? In the assertion of information or views, is there an acknowledgment or recognition of alternative views or sources? If not, and source documents are involved, one might suspect that the author had an “agenda”.
  11. Context. Can fact be distinguished from opinion? Are sources taken or used out of context?
  12. Comparative quality. Are some data clearly inferior to other data? Which are the “best” data when you consider the above eleven tests (i.e., most recent, most relevant, most authoritative, most complete, most accurate, and so forth)? You should always be evaluating the information as you browse the Net. Check the little things that journalists watch for. A misspelled name, for example, could be a warning sign, even in an academic paper, that the author was careless in other areas as well. Do any statements seem exaggerated? If so, why has the author exaggerated? Is it a spur-of-the-moment statement by e-mail, or is the exaggeration more deliberate? Are you reading instant analysis, quick off the mark, the results of a carefully crafted study, or a meta-analysis, which is a study of all previous studies on a subject? What do you think has been left out of the report? What an author omits may be just as important to a researcher as what an author includes. What’s left out could reveal much about the bias of the information you are reading. Take notes on such matters as you find sources, and possibly include footnotes in your paper, if that’s what you’re doing; it will provide evidence of your critical thinking abilities! Don’t ignore bias as a valuable source of information, either: even an untrustworthy source is valuable for what it reveals about the personality of an author, especially if he or she is an actor in the events.
  13. Problems. There’s another problem, which grew in the 1990s, called by some the Crossfire Syndrome, after Crossfire, the CNN public affairs show, and its tabloid television imitators. The Crossfire Syndrome drowns out the moderate voices in favor of polarization and polemics. Confrontation between polar opposites may make for good television, but it often paints a distorted view of reality. On the Net, and throughout North American media culture, the Crossfire Syndrome is acute. In flame wars, the shouters on both sides are left to use up bandwidth while the moderate voices back off. Political correctness of any sort, right or left, religious or secular, tends to distort material while creating interest, at the cost of ‘truth’. All of us are left with the task, with nearly any information we’re exposed to, of seeking out the facts behind the bias. It’s been said many times that there is no such thing as objectivity. The honest researcher tries to be fair. The best researcher is both prosecutor and defense attorney, searching out the facts on all sides (there are often more than two).
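A record of this kind of evaluation (the spreadsheet mentioned earlier) can also be sketched in code. A minimal illustration in Python; the 0-5 scoring scale and field names are my own assumptions, not part of the criteria:

```python
from dataclasses import dataclass, field

# The 13 criteria from the evaluation matrix above. The 0 (fails)
# to 5 (excellent) scale is an illustrative assumption.
CRITERIA = [
    "recency", "relevancy", "authority", "completeness", "accuracy",
    "clarity", "verifiability", "statistical_validity",
    "internal_consistency", "external_consistency", "context",
    "comparative_quality", "problems",
]

@dataclass
class SourceEvaluation:
    source: str
    scores: dict = field(default_factory=dict)  # criterion -> 0..5
    notes: dict = field(default_factory=dict)   # criterion -> free text

    def score(self, criterion: str, value: int, note: str = "") -> None:
        if criterion not in CRITERIA:
            raise ValueError(f"unknown criterion: {criterion}")
        self.scores[criterion] = value
        if note:
            self.notes[criterion] = note

    def unevaluated(self) -> list:
        """Criteria not yet scored: a reminder to finish the job."""
        return [c for c in CRITERIA if c not in self.scores]

ev = SourceEvaluation("hypothetical journal article")
ev.score("recency", 4, "published within the last two years")
ev.score("authority", 1, "publisher appears on a predatory-publisher list")
print(len(ev.unevaluated()))  # 11 criteria still to assess
```

Recording notes alongside scores preserves the reasoning behind each judgment, which is what maintains the integrity of the conclusions later.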

The advent of the 24-hour all-news network, along with a procession of imitators and the Web, created the CNN Effect. The CNN Effect lives off one continually asked question, ‘how bad can it get?’, as a way to maintain viewer interest. This severely distorts the information presented.

A Test

If you want to test what you have learned, here is an example.

You are researching how IT operations impact carbon dioxide emissions. In the data you uncover, you find the following journal article: Towards Green Communıcatıons. What would your evaluation of this article be?

OSINT & Zombie Journals—Part 4

Always keep in mind that, as the old pessimist philosopher Arthur Schopenhauer stated, “The truth will set you free–but first it will make you miserable.”

The RICE Method of Evaluation

The RICE method can help you decide how to respond to information or intelligence:

  • R for reliability. The basic truthfulness or accuracy of the information you are evaluating.
  • I for importance. The importance of the data based upon its relevance.
  • C for cost. The cost of the data (and of your possible reactions or actions relating to the information).
  • E for effectiveness. The effectiveness of your actions if based upon this information. Would actions based upon this information solve the problems you face?

This evaluation format is useful for summarizing collected data and for analyzing how you might apply the data in a broad range of situations. However, it does not delve into specific metrics used to evaluate a particular piece of information. The next article will provide 13 evaluation metrics to help you do that.
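As a sketch, the four RICE factors can be summarized numerically. The 1-5 scale and the decision threshold below are illustrative assumptions, not part of the method:

```python
def rice_decision(reliability: int, importance: int, cost: int,
                  effectiveness: int, threshold: int = 12) -> str:
    """Summarize a RICE evaluation.

    Each factor is rated 1 (poor) to 5 (excellent). Cost is rated so
    that 5 means LOW cost, keeping 'higher is better' for every factor.
    The threshold is an arbitrary illustration, not part of the method.
    """
    for v in (reliability, importance, cost, effectiveness):
        if not 1 <= v <= 5:
            raise ValueError("ratings must be between 1 and 5")
    total = reliability + importance + cost + effectiveness
    return "act on this information" if total >= threshold else "collect more data"

# Reliable, important, affordable to act on, and likely effective:
print(rice_decision(4, 4, 3, 4))  # act on this information
```

In practice the four factors are rarely weighted equally; a low reliability rating should usually veto action regardless of the other scores.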

In this context, the term metrics means: a system of related measures that facilitates the quantification of some particular characteristic.


Reliability is the basic truthfulness or accuracy of the information. When evaluating data from journals you may not have the technical knowledge to evaluate the content itself. However, you do have the ability to compare the date of publication to the ownership of the journal. You also have the ability to compile a list of the author’s previous journal articles, their date of publication, the names of the journals and the publisher or owner of the journal. You can also identify conferences that the author attended or at which he was a speaker. You can then determine who ran or supported these conferences.

Predatory publishers often create or sponsor conferences to showcase their authors, who may pay to be published. The conference may be part of a paid-for publication package. Sometimes they advertise a conference, take in attendance fees, and the conference never occurs.

It is sometimes difficult to link a conference to a predatory publisher. You have to examine the page source code, the conference domain registrations, and the list of speakers to tie the conference to a predatory publisher.


Cost is an important factor to consider from two perspectives. First, the obvious cost/benefit relationship, or more precisely the question: is this worth the cost? Second, if the data seem too good to be offered free, then you have to ask: why is this free?

As you now know, open access journals don’t make their money from subscriptions. You must uncover and then evaluate the relationship between how the journal makes its money and the authors who produce the content. Of course, this evaluation metric is not unique to academic and open access journals.

Part 5 will discuss a more detailed evaluation matrix.

OSINT & Zombie Journals—Part 3

Understanding the Need for Evaluation Methods & Citations

The basis of your evaluation method is understanding that you cannot know what you don’t know. However, as you come to terms with this, you can guard against the perils of the Dunning-Kruger Effect.

Dunning-Kruger Effect

There is a saying that “you cannot know what you do not know”. This might seem redundant, but it is true: it can be impossible to identify gaps in your own knowledge. In other words, you cannot teach yourself what you do not know. Without instruction and training, you are very likely to think that you know “everything” you need to know, when you actually lack the ability to recognize your mistakes – you are unconsciously incompetent. David Dunning and Justin Kruger first tested this phenomenon in a series of experiments in 1999[1].

Typically, the unskilled rate their ability as above average, much higher than it actually is, while the highly skilled underrate their abilities. Confidence is no substitute for skill and knowledge, which must be applied with confidence to ensure a positive outcome.

The Dunning-Kruger Effect may inhibit proper evaluation of the collected data without an established evaluation method. This article, and those that follow, should help you adopt an effective evaluation process.

Developing the necessary skills and knowledge is not ‘rocket science’; it is ‘time in grade’. You must simply do it, study how to do it better, and network with people who do it. This process takes years of effort, but do not give up. I have been doing this type of research for 40 years and I am still learning. Now let’s set about reducing the Dunning-Kruger Effect.

Before beginning the evaluation of the collected data itself, the investigator must prepare accurate citations as the starting point for evaluation. The citation quantifies significant attributes of the data and its source.


A citation is a reference to a published or unpublished source, though not always the original source.

Citations uphold intellectual honesty and avoid plagiarism. They provide attribution to the work and ideas of other people while allowing the reader to weigh the relevance and validity of the source material that the investigator employed.

Regardless of the citation style used, it must include the author(s), date of publication, title, and page numbers. Citations should also include any unique identifiers relevant to the type of material referenced.

The citation style you adopt will depend upon your clientele and the material being reported. If the report will include many citations, you should discuss the issue of citation style with your client before producing the report and your client should be familiar with that style, if at all possible.

Never use footnotes or endnotes for anecdotal information. This avoids having something masquerading as a citation of a source that only provides supplementary information. Supplementary information belongs in the body of the report where it is identified as such.

While doing OSINT, you might find a document from an organization that changes its name before you finish your report. In that case, the document was retrieved before the name change. How do you cite the reference? Do you cite it with the old organization name or the new name?

Normal practice is to use the name as it was when you found the document; however, this can cause problems when someone does fact-checking to independently verify the citation. Someone must then find and document the history of the organization name.

The solution is to cite the date the document was retrieved and in square brackets include the new name, for example, [currently, XTS Organization] or better still, [as of 11 Jan 13 the name changed to XTS Organization]. The latter addition to the citation creates a dated history of the organization’s name. The dated history of a journal and its publisher is of critical importance when dealing with journals that die and come back as zombies. It is wise to check Jeffrey Beall’s list of predatory publishers while preparing citations. It is also wise to state when this list was checked in a footnote or in the actual citation, as I now do.

Of course, all Web citations must include the date on which the URL was visited for the purpose it is being cited.
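The retrieval-date and name-change conventions above can be applied mechanically when assembling citations. A minimal sketch; the function name, fields, and output format are illustrative, not a formal citation style:

```python
def web_citation(author: str, year: str, title: str, org: str,
                 retrieved: str, renamed_to: str = "",
                 renamed_on: str = "") -> str:
    """Build a Web citation that records the retrieval date and,
    in square brackets, any later change to the organization's name."""
    cite = f"{author} ({year}). {title}. {org}. Retrieved {retrieved}."
    if renamed_to:
        if renamed_on:
            # Dated form: creates a dated history of the name change.
            cite += f" [as of {renamed_on} the name changed to {renamed_to}]"
        else:
            cite += f" [currently, {renamed_to}]"
    return cite

print(web_citation("Smith, J.", "2012", "Annual Report",
                   "ABC Organization", "5 Dec 2012",
                   renamed_to="XTS Organization", renamed_on="11 Jan 13"))
```

The author, title, and dates in the example are hypothetical; only the square-bracket convention comes from the text above.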

Bibliographic Databases

The large bibliographic abstract and citation databases are secondary sources that merely collect journal article abstracts and journal titles without much, or any, vetting of the article or journal.

Elsevier’s Scopus is one such service; another is the Thomson Reuters Master Journal List. Do not consider either an authoritative source of quality journals or abstracts. Both contain numerous low-quality journals produced by predatory publishers.

Lars Bjørnshauge founded an online index of open-access journals in 2003 with 300 titles. Over the next decade, the open-access publishing market exploded. By 2014, the Directory of Open Access Journals (DOAJ), now operated by the non-profit company IS4OA, had almost 10,000 journals. Today its main problem is not finding new publications to include, but keeping the predatory publishers out.

In 2014, following criticism of its quality-control process, DOAJ began asking all of its journals to reapply under stricter inclusion criteria in hopes of weeding out predatory publishers. However, the question remains: how does DOAJ determine if a publisher is lying?

Attempts to create a ‘whitelist’ of journals seem doomed to failure, especially when attempted by a non-profit using volunteers. Most researchers will judge a journal’s quality by its inclusion in major citation databases, such as Elsevier’s Scopus index, rather than by the DOAJ’s list. As you can see, Scopus and the Thomson Reuters Master Journal List are also vulnerable to manipulation by unscrupulous publishers.

Predatory publishers have realised that these lists offer a very low barrier to entry, especially in certain categories. In addition, as such databases are usually subscription services, some publishers advertise certain authors using fake citations supposedly from bibliographic databases, knowing that certain commercially valuable demographics never verify these citations.

[1] Kruger, Justin; Dunning, David (1999). “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”. Journal of Personality and Social Psychology, 77(6), 1121–1134. http://psycnet.apa.org/?&fa=main.doiLanding&doi=10.1037/0022-3514.77.6.1121 (accessed 3 Oct 2016).

A Brief History of Open Source Intelligence

An article with the above title appeared on the bellingcat site.

It is an excellent article, even if I don’t agree that OSINT went into hibernation after WW2. For example, from the end of WW2 through the end of the Cold War, the Foreign Broadcast Information Service in the US (now the Open Source Enterprise) and the BBC Monitoring Service in the UK trawled the airwaves and other open sources, regularly publishing transcripts and analysis of what they heard, a practice that continues today. There are many other examples.

On the other hand, today’s OSINT is highly influenced by a convergence of technologies. The market penetration of smartphones with 3G connections, combined with the popularity of social media sites, is one such convergence that produces raw data. The other is the availability of inexpensive software and computer hardware to process the raw data for analysis.

OSINT & Zombie Journals—Part 2

The Nature of Sources

Primary & Secondary Sources

An archive is a primary source because the contents are documents usually authored by a person with direct knowledge of the topic; this includes public records completed by the subject.

A library is a secondary source because its documents are created from the primary sources, as are citations, abstracts, bibliographic databases, etc.

Authoritative Sources

Evaluating the quality of a source means asking questions like:

  • What is the reputation of the data, and the data-provider (including the publisher)?
  • Has this source of data been cited elsewhere?
  • What is the reliability of the source?
  • How can the source of the information be documented or qualified?
  • Is this a primary source or secondary source?
  • Is this a legally required or legally binding source?

Answers to the above questions should help you find the authoritative source. Zombies are never authoritative sources.

In the next article I will discuss evaluation methods, citations, and bibliographic databases.

OSINT & Zombie Journals—Part 1

Many scholarly journals are being bought-up by predatory publishers that turn once prestigious journals into publications full of junk science. Usually these publishers turn their acquisitions into free ‘open access’ publications on the Internet that are full of typos, inaccuracies, and even outright fabrications.

One such online publisher, the OMICS Group, is being sued by the U.S. Federal Trade Commission for deceptive practices that include spam emails to solicit articles that are not peer reviewed. This same outfit recently acquired two Canadian medical journal publishers.

From the researcher’s perspective, the most deceptive practice of these free open access journals is the fact that authors pay to have their articles published. The second deceptive practice, according to the FTC, is that such publishers falsely state that their journals are widely cited and included in academic databases. To the contrary, the FTC states that PubMed does not include any of the OMICS titles. The FTC also alleges that the work of authors is sometimes held hostage for payment of undisclosed fees.

When Jeffrey Beall, an academic librarian at the University of Colorado, started compiling his list of predatory publishers, he found only 18—that was in 2010. Today, his list has over 1000 publishers.

When a predatory publisher acquires a journal, it ceases to be a scholarly journal and only lives on as something exploited for profit. Such an acquisition ends proper peer review. The journal becomes a zombie.

For the researcher conducting a literature review, the additional time and effort required to vet every article and citation to eliminate zombie journals has increased to nearly unbearable levels. Of course, this is part of the zombie strategy to flood the scholarly journal space with purulent, infectious zombies to kill-off real journals.

Zombie publications are a rising issue for serious researchers. The quality of a literature review affects the quality of the decisions based upon this collected data.

This series of articles is about recognising and avoiding open-source junk. These five articles should help you develop the evaluation skills and processes necessary to avoid falling victim to zombie journals and other forms of diseased data that infect the open-source domain.

Critical Thinking

Growing up, my mother always told me not to believe everything I read. This was good advice, but it doesn’t go far enough.

Critical thinking is an ancient concept, but the actual term began to appear in the mid-20th century. In the information age, developing this skill is essential. It is an intellectually disciplined process of actively analyzing and evaluating information. It transcends all subjects, sources, or problems. Critical thinking protects us from biased, distorted, partial, uninformed, or prejudiced content and ideas. It insulates us from improper assumptions and implications. It prevents undesirable consequences.

Critical thinking is not the application of logic for selfish purposes. Selfishness often appears under the guise of critical thought to skilfully manipulate ideas to promote a vested interest. Fortunately, this usually becomes apparent upon close examination, because selfishness typically accompanies lies and an intellectually flawed argument. When the issue is examined fair-mindedly, and with true intellectual integrity, the selfish analysis falls apart. Of course, selfish-minded individuals will call the product of true critical thinking ‘idealism’, using that term in a pejorative sense, thus further identifying their selfish motives.

The End of Dialog

The database aggregator, Dialog, is no more. It was consumed by ProQuest to become ProQuest Dialog. The resulting product has become completely useless to us for due diligence and corporate research.

The Standard & Poor’s and Corporate Affiliation databases are gone along with several others that we relied upon to create a basic profile of a company’s structure and operations. Alternatives exist, but none are as convenient as the old Dialog.

Getting Advance Knowledge of New Products

Companies operating in the U.S. often file ‘Intent-To-Use’ applications for trademarks and thereby disclose the names and descriptions of forthcoming products and services six months before the product launch. Extensions of up to two years are sometimes granted if the launch process becomes bogged down.

Searching the Trademark Electronic Search System (TESS) of the U.S. Patent & Trademark Office will find the ‘Intent-To-Use’ applications.

Business Interrupted

Managers sometimes tie themselves into knots worrying about the risk or threat rather than analysing the impact of interrupted business processes. My advice is to stop fretting about the cause and concentrate on alleviating the impact of the interrupted business processes.

To do this, defeat the problem in detail as follows:

  • Decide which processes are critical and which are not.
  • Determine how long any particular process can be interrupted before its loss becomes detrimental to operations, profitability, and customer satisfaction.
  • Design a plan of action to determine if the disruption will continue beyond the tolerable time limit.
  • Have a plan to replace each missing process.
  • Plan for the concurrent loss of several critical processes.

The key to a successful business continuity plan is concentrating on the critical day-to-day operations.
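The steps above lend themselves to a simple triage. A sketch, assuming each process record carries an estimated tolerable outage and the observed outage (all field names and figures are illustrative):

```python
def triage(processes):
    """Return the critical processes whose current outage has exceeded
    the tolerable limit, worst overrun first.

    Each process is a dict with keys: name, critical (bool),
    tolerable_hours, outage_hours.
    """
    breached = [
        p for p in processes
        if p["critical"] and p["outage_hours"] > p["tolerable_hours"]
    ]
    return sorted(breached,
                  key=lambda p: p["outage_hours"] - p["tolerable_hours"],
                  reverse=True)

plan = [
    {"name": "order entry", "critical": True, "tolerable_hours": 4, "outage_hours": 6},
    {"name": "payroll", "critical": True, "tolerable_hours": 72, "outage_hours": 6},
    {"name": "newsletter", "critical": False, "tolerable_hours": 240, "outage_hours": 300},
]
print([p["name"] for p in triage(plan)])  # ['order entry']
```

The point of the ordering is to direct the replacement plan at the worst breach first, rather than at whichever cause happens to be most alarming.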

How does this relate to investigation and research? The answer is quite simple:

  • Have you ever done a security survey?
  • Have you ever done a competitor SWOT analysis?
  • Have you ever done due diligence on a critical supplier?

Getting Out of Google

Google and other search engines are wonderful tools for gathering information; we all know that. But what if people with evil intent are gathering information about you?

Getting out of Street View

Google Street View provides a great deal of data that can be used to plan an attack on a facility, a person, or to conduct a kidnapping. Google offers an easy, free, and effective way to restrict access to this data.

At a client’s home, I found that his car licence plate was legible. This usually occurs when the car is parked inside a garage or car port. At the client’s workplace, several security measures were clearly visible as were other features of the facility that raised concerns.

Google’s solution is to place an opaque digital wall around your house or facility. To get out of Google Street View, first search for the street address. Once the property is visible, you will find a small box at the bottom right of the image that says “Report a problem”. Click on this and select a reason for blurring the image of the property. I usually select Other: This image presents security concerns. Add some descriptive data to help Google identify the property and complete the CAPTCHA (an acronym for “Completely Automated Public Turing test to tell Computers and Humans Apart”), which takes me several tries to get right. In 2 or 3 days a blurred wall should appear around the property.

Hazardous Material

The Emergency Response Guidebook, published jointly by Transport Canada, Mexico’s transportation authorities, and the USDOT, lets you identify the hazardous contents of pipelines, trucks, or trains from the placards on the side of the tanker, rail car, or pipeline. The guide lists specific hazards and evacuation distances for spills or fires. However, it doesn’t provide any spill/fire/explosion protocols.

If you are around hazardous materials and their transport conveyances, then you need this guidebook.

Bulk Sales & the PI


Sales of large quantities of stock, or the sale of the assets and equipment of the business itself outside the regular course of business, are considered a sale “in bulk”. The Bulk Sales Act is designed to protect the creditors of a business owner by requiring the owner to follow the procedures of the Act for sales outside the regular course of business.

If a buyer wishes to purchase the assets and equipment of a business, the seller “in bulk” must provide an affidavit stating that all creditors have been paid, or that they will be paid from the proceeds of the sale. In some cases the buyer pays an assigned trustee, and the creditors of the business may waive their rights, in which case the proceeds are paid.

A ‘Bulk Sales search‘ determines if a bulk sales affidavit has been filed with the relevant Ontario Superior Court of Justice office.

The Private Investigator (PI)

If you are interested in an Ontario business’s assets, debts, cash flow, and general financial condition, then a Bulk Sales Act search is an important search. It may tell you if the business is failing or if it has suffered a setback. You may learn of an abandoned line or the sale of a production facility. You may learn of a legal action in another jurisdiction by contacting or researching the other parties to the bulk sale. Any sale that indicates that creditors will be paid from the proceeds may indicate a judgment that is being satisfied, or it may be part of the settlement of a claim.


Operational Risk & Lawfare

Recently, I have been involved in a series of jobs involving Operational Risk.

Operational Risk arises from:

  • inadequate or failed processes and controls
  • people
  • systems
  • external events
  • contractual obligations
  • compliance issues
  • lawfare

Lawfare is the most interesting aspect of this type of work. It is a form of asymmetric warfare waged via the courts with the intention of damaging the firm. Special interest groups, radicals, and competitors use this tactic to inflict financial damage and to create ill will towards the targeted company.

The Investigator’s task is usually to identify the funding sources and relationship of the plaintiff to individuals and groups who would benefit from the use of this tactic.

The News and Critical Thought

“Competitive Intelligence means looking behind the news and doing an analysis to find the truth. That is not the role of newspapers. Their role is simple: to sell and make profits for their owners. If that means subjective reporting, then so be it.”
