Content Enhancement & Knowledge Services
Client Speak
"Let me thank you again for the professional way in which you are managing the project.."
A Multinational Media and Information Company
"We express our extreme satisfaction with Scope’s exceptionally high and consistent quality.."
A Leading Chemical Information Provider, USA
"We really enjoy working with Scope’s team. We like to think of our suppliers as partners in our mission and Scope has been a great partner.."
An International Standards Organization
"Scopes abstracts and keywords we received look great, and they adhere to our guidelines throughout.."
A Global Academic Publishing Company
"Very happy with the files you have sent so far. We have not had any issues with the content.."
A Life Science Membership Organization, USA

Metadata Services

Scope has rich experience in the collection, validation / updating, enrichment, tagging and classification of metadata against industry standards. Over the past ten years, Scope has handled a wide range of metadata assignments for publishers and information providers, processing over thirty million records.

Types of Metadata

The various types of metadata handled by Scope are Bibliographic Metadata (Title, Author, Publisher, ISSN, ISNI, etc.), Commercial Metadata (Price, Discount, etc.), Subject Metadata (BISAC, LCSH, DDC, MeSH, etc.), Marketing Metadata (Content reviews, Author biographies, etc.) and Administrative Metadata (Copyright, Reproduction rights, Licenses, Perpetual access, etc.).

Scope has immense experience and expertise in handling the above metadata types, which are typically exchanged in standard formats such as ONIX, KBART and MARC.

Scope’s Metadata Services include:

  • Metadata creation / collection – Collection / creation of all types of metadata for a given bibliographic record through an assisted-automation approach, with the level of automation decided by the source format (structured, semi-structured or unstructured). Scope applies its unique “techno-human” approach, blending appropriate technology with subject-domain expertise, and has the capability to handle high-volume projects across the universal Extraction, Transformation and Loading (ETL) modules.

    Scope has a proprietary metadata extraction platform that extracts metadata from input files (ONIX / MARC / PDF) and builds an inventory to the client’s requirement. The platform is customizable and can accommodate client-specific requirements. Its pre-processing module identifies and extracts the specified metadata using natural language processing (NLP) rules and, in the case of non-editable PDFs and scanned images, location-based heuristics. The extracted content is then reviewed by Scope’s subject matter experts (SMEs).
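To make the extraction step of such an ETL pipeline concrete, the sketch below pulls two fields out of a simplified ONIX-style record into a flat dictionary ready for transformation and loading. The element names follow ONIX 3.0 conventions, but the sample record, field selection and `extract` helper are illustrative assumptions, not a description of Scope’s actual platform.

```python
import xml.etree.ElementTree as ET

# A simplified ONIX-style record (illustrative only; real ONIX 3.0
# messages carry many more elements and a namespace).
record = """
<Product>
  <RecordReference>com.example.0001</RecordReference>
  <DescriptiveDetail>
    <TitleDetail>
      <TitleElement>
        <TitleText>Introduction to Metadata</TitleText>
      </TitleElement>
    </TitleDetail>
  </DescriptiveDetail>
</Product>
"""

def extract(xml_text: str) -> dict:
    """Extract step of a toy ETL pipeline: pull selected fields
    into a flat dict for the transformation and loading stages."""
    root = ET.fromstring(xml_text)
    return {
        "reference": root.findtext("RecordReference"),
        "title": root.findtext(".//TitleText"),
    }

print(extract(record))
# → {'reference': 'com.example.0001', 'title': 'Introduction to Metadata'}
```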

  • Metadata validation / updating & enrichment – Validation, updating and enrichment of existing metadata through detailed research of publisher and publication-title websites, authoritative third-party sources such as the ISSN and OCLC databases, and library catalogs such as the British Library Catalog. This service also covers reference citations formatted to universal styles such as APA Style, the Chicago Manual of Style and MLA Style.
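One mechanical check that fits this kind of validation pass is the ISSN check digit defined by ISO 3297: the first seven digits are weighted 8 down to 2, summed modulo 11, and the remainder determines the final character ("X" stands for 10). A minimal sketch:

```python
def issn_check_digit(digits7: str) -> str:
    """Check digit for the first seven ISSN digits (mod 11, weights
    8 down to 2; a remainder of 10 is written as 'X')."""
    total = sum(int(d) * w for d, w in zip(digits7, range(8, 1, -1)))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)

def is_valid_issn(issn: str) -> bool:
    """Validate an ISSN such as '0378-5955' (hyphen optional)."""
    s = issn.replace("-", "").upper()
    if len(s) != 8 or not s[:7].isdigit():
        return False
    return issn_check_digit(s[:7]) == s[7]

print(is_valid_issn("0378-5955"))  # True  (a valid ISSN)
print(is_valid_issn("0378-5954"))  # False (wrong check digit)
```

A check like this catches transcription errors before any web research begins; records that fail it can be routed straight to manual review.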

  • Metadata standardization / normalization – Standardization of bibliographic data such as journal names and subject categories is an essential prerequisite: because the data is sourced from different publishers and institutions, it usually contains duplicates, spelling errors and variants. The key standardization / normalization problems Scope handles include varied representations of the same metadata (for example, abbreviations and syntactic or morphological variants) as well as acronyms and special characters.
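A minimal sketch of this kind of normalization, assuming a toy abbreviation table and stopword list (a production system would draw on an authority file such as the LTWA or an in-house dictionary):

```python
import re
import unicodedata

# Hypothetical lookup tables for illustration only.
ABBREVIATIONS = {"j.": "journal", "proc.": "proceedings", "intl.": "international"}
STOPWORDS = {"of", "the", "and"}

def normalize_title(title: str) -> str:
    """Collapse case, accents, punctuation, stopwords and known
    abbreviations so variants of one journal name share a single key."""
    t = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    words = [ABBREVIATIONS.get(w, w) for w in t.lower().split() if w not in STOPWORDS]
    t = re.sub(r"[^\w\s]", "", " ".join(words))  # strip remaining punctuation
    return re.sub(r"\s+", " ", t).strip()        # collapse internal whitespace

variants = ["J. Chemical Physics", "Journal of  Chemical Physics",
            "journal of chemical physics"]
print({normalize_title(v) for v in variants})  # all three collapse to one key
```

Once variants share a normalized key, duplicates can be grouped and a single preferred form chosen for the clean record.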

  • Subject classification – Classification / indexing of bibliographic records against library classification schemes and vocabularies such as Dewey Decimal Classification (DDC), Library of Congress Classification (LC), NLM, MeSH, BISAC and BIC.
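In its simplest form, classification of this kind can be sketched as keyword rules mapping record text to codes. The three BISAC codes below are real, but the keyword rules and `classify` helper are toy assumptions; production classification relies on the full controlled vocabularies named above, subject experts and, often, trained models.

```python
# Illustrative keyword rules; not an actual classification scheme.
RULES = {
    "SCI013000": ["chemistry", "chemical"],   # SCIENCE / Chemistry / General
    "MED000000": ["medicine", "clinical"],    # MEDICAL / General
    "COM000000": ["software", "computing"],   # COMPUTERS / General
}

def classify(text: str) -> list:
    """Return every code whose keywords appear in the record text."""
    t = text.lower()
    return [code for code, kws in RULES.items() if any(k in t for k in kws)]

print(classify("Advances in clinical chemistry"))
# → ['SCI013000', 'MED000000']
```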

  • Content search & alerts through web crawlers – Scope’s proprietary Data Crawler tracks changes in content at a configured frequency and updates the information accordingly. All processes are IT-enabled with a workflow tool to handle client processes effectively.
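One common way a crawler detects such changes, sketched here under the assumption of a simple fetch-and-compare loop (the URL and in-memory store are illustrative; a real crawler would persist fingerprints in a database), is to hash each page and compare digests between visits:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Stable digest of a page body; any edit changes the digest."""
    return hashlib.sha256(content).hexdigest()

# seen: url -> last known digest (a database table in production)
seen = {}

def has_changed(url: str, content: bytes) -> bool:
    """Compare the fresh fetch against the stored fingerprint and
    record the new state; True means an alert should be raised."""
    digest = fingerprint(content)
    changed = seen.get(url) != digest
    seen[url] = digest
    return changed

print(has_changed("https://example.org/toc", b"Issue 1"))  # True  (first visit)
print(has_changed("https://example.org/toc", b"Issue 1"))  # False (no change)
print(has_changed("https://example.org/toc", b"Issue 2"))  # True  (updated)
```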