TV is Alive

TV broadcasting in India depends on cable or DTH. TV is viewed by 878 million Indians, and earns advertising and pay revenues of Rs 72,000 crore a year. TV's position declined marginally last year (2021).

Digital video is picking up. At present, the tech-media firms Disney+Hotstar and Sony LIV are both owned by television companies. The internet has risen as the latest distribution technology, and we now see connected TVs, or internet-enabled smart TVs. Both the top and bottom ends of the market drive growth: viewers at the top end are migrating to ‘pay OTT’, or subscription-based streaming platforms, and at the bottom end to free-to-air TV and free OTT. This means a decline of TV on the older distribution formats (cable and DTH) and its concurrent resurrection online.

The biggest loser is cable TV. The future of TV depends on the free ecosystem and the rise of pay OTTs.

The GEC channels, both Hindi and regional, have good viewership. There is reasonable viewership for kids channels and news channels.

TV, no doubt, is undergoing a transformation. The upper-end audience is shifting to OTTs and the lower end to free-to-air TV and free OTTs. The people in the middle remain committed to TV. With 878 million viewers, it remains by far the largest video medium in India.

In pay TV, cable is on the decline, and IPTV, or internet protocol TV, is on the rise. IPTV is video delivered on a closed proprietary network, either a LAN or a WAN. OTT, on the other hand, is delivered over the open internet.

Licensing and Merchandising

The licensing and merchandising industry in India has been slow to grow. Recently, it has shown some signs of revival, but it is still at a nascent stage.

As we know, Walt Disney globally is number one in licensing and merchandising. It has very successfully used its characters for secondary commercial exploitation by creating a global franchise for them. Worldwide, it has 3500 licensees. The characters are associated with toys, fashion and home, food, health and beauty, stationery and publishing.

India is a complex market. Branded merchandise is restricted to big towns and cities. Customers are price-sensitive, and customer awareness is low: many cannot distinguish between an original and a copy. Piracy is rampant.

Entertainment is the biggest base for licensing. Stationery and toys are the highest selling categories.

Characters like Chhota Bheem are chosen for action toys. There are quite a few Indian characters that can be leveraged. The digital boom is an incentive. Metaverse and Web 3.0 too present tremendous opportunity.

Batteries

In battery electrochemistry, manganese is an effective element. It is used in higher-density batteries; otherwise, a nickel-cobalt formulation is used. These elements are used in smaller amounts than other elements.

High-density power packs can take vehicles further, but they are prone to fires and are not considered stable.

There are other combinations too, such as lithium-nickel-manganese oxide (LNMO). The use of manganese gives the battery a limited life cycle and high resistance, which means the battery runs too hot and the voltage drops. CATL's version (and others) seem to have overcome this.

The demand for manganese is likely to rise, and it will be an important component for batteries. There is no shortage of Mn: South Africa tops the list of producers, followed by China, Australia, Brazil, Gabon and India.

Media Algorithm

Of late, news is being consumed digitally — from social media by 63 per cent, from YouTube by 53 per cent and from WhatsApp by 51 per cent.

While consuming news on YouTube, the audience comes across news content through the recommendations of algorithms deployed by the platform. These algorithms could privilege certain kinds of news over others, and there could be editorial bias in recommending some news more often than other news. The algorithms thus make the process opaque.

On WhatsApp, there is peer-to-peer (P2P) sharing of news, which can make news go viral. There is no mechanism to assess a viral trend, so it becomes difficult to spot fake news and to counter disinformation. The source of such viral content is difficult to trace because of end-to-end encryption.

This poses a regulatory challenge: how to regulate these social media intermediaries. These platforms could block an entire point of view on certain issues globally, and there are several instances of arbitrary decisions.

At the same time, broadcast media's reach is plateauing, which makes social media more powerful as gatekeepers. They can enhance the reach of live broadcasts and take them to a diverse audience. The audience is fast switching from conventional TV/radio broadcasts to internet streaming. Direct-to-Mobile (D2M) is thus an area that will merit the attention of regulators and policy makers.

The capability to regulate algorithms must be developed not only for media and social media, but across sectors such as finance, sciences and chemicals.

Opacity favours certain kinds of political speech over others, and raises issues of media neutrality. There should be an algorithmic regulatory framework for India.

Ed-tech : Self-regulation

There are several allegations against the ed-tech sector: aggressive selling methods, false representation of deliverables, predatory pricing schemes, poor quality of content, and courses sold to parents through misrepresentation.

The sector is still evolving, and could be regulated through a code of conduct and behavioural guidelines. There is already a self-regulatory body, the India Ed-tech Consortium (IEC), which has created a two-tier grievance redressal mechanism to resolve complaints. Member companies must have a grievance officer to resolve complaints at the first level. They are also taking steps to register with the National Consumer Helpline (NCH). The body should now prove its effectiveness.

The government is considering soft-touch regulation for many upcoming sectors. The idea is to grant recognition to ed-tech institutes and allow regulation through self-regulatory bodies.

Ed-tech has to adopt the tenets of NEP, 2020. These are affordability, accessibility, quality, equity and accountability.

In India, ed-tech stands third in startup funding, after e-commerce and fintech. It has made good teachers available beyond geographical boundaries. Ed-tech is open to hybrid experiments, and can adopt AR/VR, AI and ML to make online learning effective.

Video Conferencing

Organisations have adopted a flexible hybrid model of work, which presupposes effective video conferencing. It should be possible to start meetings anywhere with minimal steps and to integrate with familiar software applications. Logitech, which operates in this field, offers products that are easy to set up and require no special training. A hybrid workforce must be linked to systems, devices, apps and each other. Video conferencing devices should work seamlessly with common software applications such as Microsoft Teams, Zoom and Google Meet.

It is necessary to standardise video conferencing solutions. An ideal solution must be affordable, scalable and easy to set up and use. It should be flexible enough to meet changing business requirements.

Companies are reconsidering real estate usage. Residences will become video-ready, and workplaces are being set up for video communications. Even sales pitches to clients can be made over video.

Open Source Software

Open source software is increasingly being used, especially by startups, to reduce costs and the time taken for software development. As there are no licence costs associated with open source, it is attractive to startups.

According to GitHub, it had 73 million users in 2021. Of these, 7.2 million were from India, a close third behind the USA (13.5 million users) and China (7.6 million users). By 2023, GitHub expects 10 million Indian developers on its platform.

Open source is beneficial, but it has its shortcomings. Most code bases contain at least one vulnerability, and in terms of maintenance, most are more than four years out of date.

Open source security has made considerable advances. Still, code bases carry high-risk vulnerabilities and outdated versions of open source components, and steps to fix them are taken only after they are detected.

To overcome these shortcomings, it is necessary to use a Software Bill of Materials (SBOM) that spells out a complete inventory of the code base: open source components, versions, and known vulnerabilities. The SBOM helps to determine whether any outdated or insecure code is in use. Secondly, since it is difficult to track and patch vulnerabilities manually, automated scanning can be used to detect and patch them. Thirdly, organisations must incorporate security right from the start of the development process, an approach called Shift Left: a developer must follow secure coding practices at every step of development.
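The SBOM idea can be sketched in a few lines: keep an inventory of components and versions, and cross-reference it against a list of known vulnerabilities. The component names, versions and advisory IDs below are invented for illustration; real tooling would pull these from a package manifest and a vulnerability database.

```python
# A minimal sketch of an SBOM check: cross-reference a code base's
# component inventory against a (hypothetical) known-vulnerability list.
# All names, versions and advisory IDs here are illustrative.

def find_risky_components(sbom, advisories):
    """Return SBOM entries that appear in the advisory list."""
    flagged = []
    for component in sbom:
        key = (component["name"], component["version"])
        if key in advisories:
            flagged.append({**component, "advisory": advisories[key]})
    return flagged

sbom = [
    {"name": "libexample", "version": "1.2.0"},
    {"name": "fastparse",  "version": "4.1.3"},
]
advisories = {
    ("libexample", "1.2.0"): "EXAMPLE-2021-0001",
}

risky = find_risky_components(sbom, advisories)
for entry in risky:
    print(f'{entry["name"]} {entry["version"]}: {entry["advisory"]}')
```

Run periodically (or in CI), such a check is what automated scanning builds on: once a component/version pair is flagged, the fix is to upgrade or patch it.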

Building Metaverse

The metaverse, it is believed, is the next version of the internet. It essentially means a transition from 2D to 3D, and a transition with you in it: a virtual space shared with other people. There are challenges in building a metaverse that gives the feel of the physical world. There has to be a sense of physical presence among people who, in reality, are not in the same room. Quest 2, a VR headset, replicates real-life behaviour. To illustrate, the voice of a person sitting opposite you must come from the opposite direction. Gestures and hand movements too must be captured to some extent, and facial expressions must be captured accurately. Even accessories used in one place, say spectacles, must not disappear in other places.

India has a huge pool of IT professionals and an ecosystem that will be helpful in building the metaverse. The metaverse is the collective result of isolated immersive systems, and it has entered the consciousness of people the world over.

New prototypes of the VR and AR headsets must be developed and there should be continuous R&D in this area.

Software Programming

All the programming a software engineer does must be scalable, adaptable and maintainable. Technology advances by leaps and bounds, and so the code must take a forward-looking approach. There should be attention to detail, along with an eye on the larger picture. Well-designed software must be resilient to technology changes; in the long run, whatever the implementation language, the solution must remain scalable, adaptable and maintainable. The more modular the solution, the better it is.
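Modularity is what makes code survive technology changes: callers depend on a narrow interface, so the implementation behind it can be swapped without touching the rest of the code. A minimal sketch, with names invented for the example:

```python
# A small illustration of modular design: application code depends on a
# narrow Storage interface, so the backing implementation can change
# (in-memory today, a database tomorrow) without touching callers.

class Storage:
    """Narrow interface the application codes against."""
    def save(self, key, value):
        raise NotImplementedError
    def load(self, key):
        raise NotImplementedError

class InMemoryStorage(Storage):
    """One interchangeable implementation; a DB-backed one could replace it."""
    def __init__(self):
        self._data = {}
    def save(self, key, value):
        self._data[key] = value
    def load(self, key):
        return self._data[key]

def record_score(storage, player, score):
    # Application logic knows only the Storage interface.
    storage.save(player, score)

store = InMemoryStorage()
record_score(store, "asha", 42)
print(store.load("asha"))  # prints 42
```

Swapping `InMemoryStorage` for another subclass leaves `record_score` untouched; that is the sense in which modular code stays adaptable.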

It is important to have knowledge of the domain where the code will be deployed. If a developer understands the context, the programme will be both useful and valuable. A beginner may not have complete domain knowledge, but should come to know more about the domain over a period of time.

A programmer must master a portfolio of technologies which can be put to use while developing products and solutions. A full-stack programmer is a valuable asset.

Programmes are infected by viruses or bugs. A bug's presence in one place indicates that it may be present elsewhere too, so it is better to fix the root cause; youngsters should consult senior colleagues while doing so. Code contains repetitive patterns, and a bug is likely to recur in such repetitions.
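Fixing the root cause often means removing the repetition itself: if flawed logic was copy-pasted into two places, extract one shared, corrected helper instead of patching a single occurrence. The scenario below is invented for illustration:

```python
# Sketch of fixing a bug at its root: the same flawed amount-parsing
# logic was duplicated in two totalling functions, so the fix is one
# shared helper rather than a patch in a single place.

def parse_amount(text):
    # Shared helper: strip the currency marker and commas once, correctly.
    return int(text.replace("Rs.", "").replace(",", "").strip())

def invoice_total(lines):
    return sum(parse_amount(line) for line in lines)

def receipt_total(lines):
    # Previously repeated the parsing inline (with the same bug);
    # now both callers share the corrected helper.
    return sum(parse_amount(line) for line in lines)

print(invoice_total(["Rs. 1,200", "Rs. 800"]))  # prints 2000
```

Once the duplication is gone, a future fix in `parse_amount` repairs every caller at once, which is exactly why repeated patterns deserve a second look during debugging.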

Spark

Data scientists have to learn Spark. (Apache Spark should not be confused with SPARK, a formally defined programming language based on Ada.) Apache Spark is an open source framework that focuses on interactive queries, ML and real-time workloads.

Spark does not have its own storage system. It holds data in RDDs (Resilient Distributed Datasets) spread across partitions, and runs analytics on other storage systems such as Amazon S3, Amazon Redshift, Cassandra, Couchbase and others.
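The core idea behind an RDD, partitioned data processed piecewise and then combined, can be sketched in plain Python. This is a single-machine illustration of the concept only; real Spark distributes the partitions across a cluster and runs them in parallel.

```python
# A minimal, single-machine sketch of partitioned processing, the idea
# behind RDDs: data is split into partitions, each partition is mapped
# independently, and partial results are merged (reduced).

def partition(data, n):
    """Split data into n roughly equal partitions."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def word_count(partitions):
    partials = []
    for part in partitions:          # each partition could run in parallel
        counts = {}
        for word in part:
            counts[word] = counts.get(word, 0) + 1
        partials.append(counts)
    merged = {}                      # reduce: combine the partial counts
    for counts in partials:
        for word, c in counts.items():
            merged[word] = merged.get(word, 0) + c
    return merged

words = ["spark", "hadoop", "spark", "rdd", "spark"]
print(word_count(partition(words, 2)))
```

Because each partition is processed independently, the same logic scales out by simply assigning partitions to different machines, which is what Spark's scheduler does.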

Spark is an open source, general-purpose distributed data processing engine suitable for Big Data workloads, for any developer or data scientist dealing with Big Data.

It utilises in-memory caching and optimised query execution for fast analytic queries.

It provides APIs for development in Java, Scala, Python and R.

It supports code reuse across multiple workloads, say batch processing, interactive queries, real-time analytics, ML and graph processing.

It is the most popular data distributed processing framework.

Spark SQL provides programming abstraction called DataFrames and also acts as a distributed SQL query engine.

The purpose was to create a new framework optimised for fast iterative processing such as ML and interactive data analysis, while retaining the scalability and fault tolerance of Hadoop MapReduce.

Spark started as a research project at AMPLab, UC Berkeley. Apache Spark was created by PhD scholars as a unified analytics tool with many libraries for Big Data processing.

Apache Spark vs. Apache Hadoop

Hadoop is also an open source framework, with the Hadoop Distributed File System (HDFS) as storage.

There is YARN as a way of managing the computing resources used by different applications.

There is an implementation of the MapReduce programming model as an execution engine. The different execution engines are Spark, Tez and Presto.

Hadoop is a sequential, multi-step process. At each step, it reads data from the cluster, performs operations, and writes results back to HDFS. Since each step requires a disk read and write, jobs are slower because of the latency of disk I/O.

Benefits of Spark

As there is in-memory caching and optimised query execution, it is fast.

It is developer-friendly, offering a variety of languages to build apps, say Java, Scala, R and Python. These APIs make things easy for developers.

It has the ability to run multiple workloads, including interactive queries, real-time analytics, ML and graph processing. One app can combine multiple workloads seamlessly.

Spark Workloads

There is Spark Core as the foundation for the platform. Then there is Spark SQL for interactive queries. Next we have Spark Streaming for real-time analytics. There is also MLlib for machine learning. Last there is GraphX for graph processing.

How Spark Scores over MapReduce

Spark is an answer to the shortcomings of MapReduce. It does processing in-memory. It reduces the number of steps in a job. It reuses data across multiple parallel operations.

In Spark, data is read into memory once, operations are performed, and the results are written back, so far fewer steps are needed and execution is faster. Spark reuses data via an in-memory cache to accelerate ML algorithms that repeatedly call a function on the same dataset.

Data reuse is enabled by DataFrames. DataFrames are a collection of objects cached in memory and used in multiple Spark operations.

This lowers latency. Spark is several times faster than MapReduce for ML and interactive analytics.
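The benefit of in-memory reuse for iterative workloads can be shown with a plain-Python sketch. The "load" below merely counts how many times storage would be hit; it stands in for a slow disk read, and the numbers are illustrative of the access pattern, not of Spark's actual internals.

```python
# Sketch of why in-memory reuse helps iterative algorithms: instead of
# re-reading the dataset from storage on every iteration (as a chain of
# MapReduce steps effectively does), load it once and reuse it in memory.

loads_from_storage = 0

def load_dataset():
    global loads_from_storage
    loads_from_storage += 1          # stands in for a slow disk read
    return [1.0, 2.0, 3.0, 4.0]

def mean(xs):
    return sum(xs) / len(xs)

# Without caching: every iteration re-loads the data from storage.
for _ in range(5):
    mean(load_dataset())
reads_without_cache = loads_from_storage

# With caching: load once, then reuse the in-memory copy each iteration.
loads_from_storage = 0
data = load_dataset()
for _ in range(5):
    mean(data)
reads_with_cache = loads_from_storage

print(reads_without_cache, reads_with_cache)  # prints 5 1
```

An iterative ML algorithm may make hundreds of passes over the same dataset, so turning a storage read per pass into a single read is where most of Spark's speed-up over MapReduce comes from.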

Use Cases

Spark is used in financial services such as banking to assess customer churn and to develop new financial products. In investment banking, it is used to analyse stock prices and predict future trends.

It is used in healthcare to provide comprehensive patient care, help frontline workers in patient interactions, and predict future trends.

It is used in manufacturing to eliminate the downtime of IoT devices by predicting when preventive maintenance is needed.

It is used in retail to attract and retain customers through personalised services and offers.

Spark and Cloud

Spark is an ideal workload for the cloud, which provides performance, scalability, reliability, availability and economies of scale.

Amazon EMR is the best place to deploy Apache Spark in the cloud.

Spark and Hadoop

Spark and Hadoop work better together, but Spark can also run in stand-alone mode. In that case, a resource manager such as YARN or Mesos is needed.

Hadoop is not a pre-requisite to learn Spark. It is an independent project. It became popular because it runs on top of HDFS along with other Hadoop components.

The Spark Core data processing engine works along with libraries for SQL, ML, graph computation and stream processing, and these can be used together in an application. ML is an iterative process that requires fast processing; Spark's in-memory data processing makes this possible. Data scientists leverage the speed, ease and integration of Spark's unified platform.

Learning Spark is not difficult if you have a basic understanding of Python or any programming language. It provides APIs in Java, Python and Scala.