The impact of an article is often shown by how many times it has been cited since publication. Most measures focus on citations within the scholarly literature.
Altmetrics (see bottom of page) track mentions in alternative venues such as popular sources, news articles, and Twitter.
Your choice of measure will be determined by the type of engagement you would like to show.
The most common way to demonstrate the impact of an article is to use a 'cited by' count to show how many subsequent pieces of research cite that article.
There are three main providers of this information: Web of Science, Scopus, and Google Scholar. The citation count in each database will differ because the scope and coverage of each is different. Choose one source and cite it in your CV alongside the citation count (for example, "Cited 20 times in Web of Science"). For official purposes, Google Scholar counts are not recommended because they can include self-citations and shadow citations and may be inflated.
In each database, search using the title of the article and click the 'Cited By' link to see how many times the article has been cited since publication.
The Field Normalized Citation Impact (FNCI) is the ratio between the actual citations received by a publication and the average number of citations received by similar publications. It is meant to correct for the effect that differing disciplinary patterns of scholarly communication and publication age can have on non-normalized metrics, such as citation counts. (text from the Metrics Toolkit)
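Since FNCI is simply a ratio, the arithmetic can be sketched in a few lines. The numbers below are invented for illustration; real baselines come from the metric provider:

```python
# Illustrative FNCI calculation (all values invented for this example).
# FNCI = actual citations / average citations received by similar
# publications (same field, publication year, and document type).

def fnci(actual_citations: float, field_average: float) -> float:
    """Ratio of a paper's citation count to its field/year baseline."""
    if field_average <= 0:
        raise ValueError("field average must be positive")
    return actual_citations / field_average

# A paper cited 30 times in a field where comparable papers average
# 15 citations scores an FNCI of 2.0, i.e. twice the expected impact.
print(fnci(30, 15))  # 2.0
```

An FNCI above 1.0 means the paper is cited more than its field's average; below 1.0, less.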
The Relative Citation Ratio is a field-normalized metric that shows the scientific influence of one or more articles relative to the average NIH-funded paper. It can be found by using the iCite tool from the National Institutes of Health.
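Besides the web interface, iCite also offers an HTTP API. The sketch below parses a response of the general shape that API returns; the endpoint pattern and field names are assumptions based on iCite's public documentation, so verify them against the current docs before relying on them:

```python
import json

# Hypothetical sketch of pulling the Relative Citation Ratio (RCR)
# from the NIH iCite API. The URL pattern and the field names
# ("data", "pmid", "relative_citation_ratio") are assumptions --
# check iCite's current API documentation.
ICITE_URL = "https://icite.od.nih.gov/api/pubs?pmids={pmids}"

def extract_rcr(payload: dict) -> dict:
    """Map each PMID to its RCR from an iCite-style JSON response."""
    return {rec["pmid"]: rec.get("relative_citation_ratio")
            for rec in payload.get("data", [])}

# Sample response in the assumed shape (values invented):
sample = json.loads(
    '{"data": [{"pmid": 23456789, "relative_citation_ratio": 2.1}]}'
)
print(extract_rcr(sample))  # {23456789: 2.1}
```

In practice you would fetch `ICITE_URL` with an HTTP client and pass the decoded JSON to `extract_rcr`.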
Altmetrics are measures of research impact outside of traditional sources, including Facebook posts, tweets, blog posts, Wikipedia entries, online multimedia, and more. These measures are provided primarily by the company Altmetric, although a competitor, Plum Analytics, has become more widely available in recent years.
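Altmetric exposes a public, read-only API keyed by DOI. As a hedged sketch, the code below summarizes a response of the shape that API returns; the URL pattern and field names are assumptions drawn from Altmetric's public documentation and should be checked against the current version:

```python
# Hypothetical sketch of reading attention counts from the Altmetric
# public API (https://api.altmetric.com/v1/doi/<doi>). The field names
# ("score", "cited_by_tweeters_count", "cited_by_msm_count") are
# assumptions -- verify against Altmetric's current API documentation.

def summarize_attention(payload: dict) -> str:
    """Build a one-line summary from an Altmetric-style DOI response."""
    score = payload.get("score", 0)
    tweets = payload.get("cited_by_tweeters_count", 0)
    news = payload.get("cited_by_msm_count", 0)
    return f"Altmetric score {score}: {tweets} tweeters, {news} news outlets"

# Invented sample response:
sample = {"score": 42, "cited_by_tweeters_count": 30, "cited_by_msm_count": 3}
print(summarize_attention(sample))
```

Note that rate limits and terms of use apply to the public API; heavier use typically requires an API key from Altmetric.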
Examples of journals and databases that integrate altmetric analytics into their publications include: BioMed Central, EBSCOhost, Elsevier, Frontiers Media, the JAMA Network, Karger, Michigan Publishing, Nature Publishing Group, Public Library of Science (PLoS), Scopus, Summon (ProQuest), Taylor & Francis, EBSCO Discovery Service, Weave: Journal of Library User Experience, and Wiley.
Altmetrics are an interesting way to demonstrate the reach of, and engagement with, your work beyond traditional scholarly metrics. Depending on the discipline and audience of the work, they can be useful. One major factor to consider is that social media engagement can be bought and sold (often through fake, bot-based accounts on platforms like Twitter and Facebook), which can inflate recorded engagement.
Altmetric has a program through which academic researchers doing a one-time project can apply for free access to Altmetric data. More information, as well as the terms and conditions for gaining access, can be found on the Altmetric Researcher Data Access Program webpage.