Understanding the Need for Evaluation Methods & Citations
The basis of any evaluation method is accepting that you cannot know what you do not know. Once you come to terms with this, however, you can guard against the perils of the Dunning-Kruger Effect.
There is a saying that “you cannot know what you do not know”. This might seem tautological, but it captures a real problem: it can be impossible to identify gaps in your own knowledge. In other words, you cannot teach yourself what you do not know. Without instruction and training, you are very likely to believe that you know “everything” you need to know, while lacking the ability to recognize your mistakes – you are unconsciously incompetent. David Dunning and Justin Kruger first tested this phenomenon in a series of experiments in 1999.
Typically, the unskilled rate their ability as above average, much higher than it actually is, while the highly skilled underrate their abilities. Confidence is no substitute for skill and knowledge; rather, skill and knowledge must be applied with confidence to ensure a positive outcome.
Without an established evaluation method, the Dunning-Kruger Effect may inhibit proper evaluation of the collected data. This article, and those that follow, should help you adopt an effective evaluation process.
Developing the necessary skills and knowledge is not ‘rocket science’; it is ‘time in grade’. You must simply do it, study how to do it better, and network with people who do it. This process takes years of effort, but do not give up. I have been doing this type of research for 40 years and I am still learning. Now let’s set about reducing the Dunning-Kruger Effect.
Before beginning the evaluation of the collected data itself, the investigator must prepare accurate citations as the starting point for evaluation. The citation records the significant attributes of the data and its source.
A citation is a reference to a published or unpublished source, though not always the original source.
Citations uphold intellectual honesty and avoid plagiarism. They provide attribution to the work and ideas of other people while allowing the reader to weigh the relevance and validity of the source material that the investigator employed.
Regardless of the citation style used, it must include the author(s), date of publication, title, and page numbers. Citations should also include any unique identifiers relevant to the type of material referenced.
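The required elements above can be thought of as a simple record. The following is a minimal sketch of that idea; the field names and the output format are illustrative, not any particular citation style.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Citation:
    """The elements every citation must carry, regardless of style."""
    authors: str                       # author(s)
    year: int                          # date of publication
    title: str
    pages: str                         # page numbers, e.g. "1121-1134"
    identifier: Optional[str] = None   # any unique identifier: DOI, ISBN, URL

    def format(self) -> str:
        # Illustrative output format only; adapt to the style your client expects.
        cite = f'{self.authors} ({self.year}). "{self.title}", {self.pages}.'
        if self.identifier:
            cite += f" {self.identifier}."
        return cite

print(Citation("Kruger, Justin; Dunning, David", 1999,
               "Unskilled and Unaware of It", "1121-1134").format())
```

Whatever style you settle on, a checklist like this makes it harder to omit a required element while preparing the report.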
The citation style you adopt will depend upon your clientele and the material being reported. If the report will include many citations, discuss citation style with your client before producing the report, and, if at all possible, choose a style your client is familiar with.
Never use footnotes or endnotes for anecdotal information; otherwise supplementary information masquerades as the citation of a source. Supplementary information belongs in the body of the report, where it is identified as such.
While doing OSINT, you might find a document from an organization that changes its name before you finish your report. In that case, the document was retrieved before the name change. How do you cite the reference – with the old organization name or the new one?
Normal practice is to use the name as it was when you found the document; however, this causes problems when someone fact-checks the citation to independently verify it, because they must then find and document the history of the organization’s name.
The solution is to cite the date the document was retrieved and to include the new name in square brackets, for example, [currently, XTS Organization] or, better still, [as of 11 Jan 13 the name changed to XTS Organization]. The latter addition to the citation creates a dated history of the organization’s name. The dated history of a journal and its publisher is of critical importance when dealing with journals that die and come back as zombies. It is wise to check Jeffrey Beall’s list of predatory publishers while preparing citations. It is also wise to state when this list was checked in a footnote or in the actual citation, as I now do.
Of course, all Web citations must include the date on which the URL was visited for the purpose for which it is being cited.
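The annotation practice described above can be sketched as a small helper that appends the retrieval date and, when applicable, a dated note of the renamed organization. This is a sketch only; the organization names and URL are made up for illustration.

```python
from datetime import date
from typing import Optional

def web_citation(source: str, url: str, retrieved: date,
                 renamed_to: Optional[str] = None,
                 renamed_on: Optional[date] = None) -> str:
    """Build a Web citation carrying the retrieval date and, if the
    organization has since been renamed, a dated bracketed note."""
    cite = f"{source}. {url} (retrieved {retrieved:%d %b %Y})"
    if renamed_to and renamed_on:
        # Dated form preferred: it creates a history of the name change.
        cite += f" [as of {renamed_on:%d %b %Y} the name changed to {renamed_to}]"
    return cite + "."

# Hypothetical document retrieved before the XTS name change:
print(web_citation("XTS White Paper", "http://example.org/paper",
                   date(2012, 6, 4),
                   renamed_to="XTS Organization",
                   renamed_on=date(2013, 1, 11)))
```

The point of automating even this small step is consistency: every Web citation in the report carries its retrieval date in the same form, which makes later fact-checking straightforward.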
The large bibliographic abstract and citation databases are secondary sources that merely collect journal article abstracts and journal titles without much, or any, vetting of the article or journal.
Elsevier’s Scopus is one such service; another is the Thomson Reuters Master Journal List. Do not consider either an authoritative source of quality journals or abstracts. Both contain numerous low-quality journals produced by predatory publishers.
Lars Bjørnshauge founded an online index of open-access journals in 2003 with 300 titles. Over the next decade, the open-access publishing market exploded. By 2014, the Directory of Open Access Journals (DOAJ), now operated by the non-profit company IS4OA, listed almost 10,000 journals. Today its main problem is not finding new publications to include, but keeping the predatory publishers out.
In 2014, following criticism of its quality-control process, DOAJ began asking all of its journals to reapply under stricter inclusion criteria in hopes of weeding out predatory publishers. However, the question remains: how does DOAJ determine whether the publisher is lying?
Attempts to create a ‘whitelist’ of journals seem doomed to failure, especially when attempted by a non-profit using volunteers. Most researchers will judge a journal’s quality by its inclusion in major citation databases, such as Elsevier’s Scopus index, rather than the DOAJ’s list. As you can see, Scopus and the Thomson Reuters Master Journal List are also vulnerable to manipulation by unscrupulous publishers.
Predatory publishers have realized that these lists offer a very low barrier to entry, especially in certain categories. In addition, because such databases are usually subscription services, some publishers promote certain authors using fake citations supposedly drawn from bibliographic databases, knowing that certain commercially valuable demographics never verify them.
Kruger, Justin; Dunning, David (1999). “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”. Journal of Personality and Social Psychology, 77(6), 1121–1134. doi:10.1037/0022-3514.77.6.1121 (retrieved 3 Oct 2016).