Saturday, 22 February 2020

Top CSS Interview Questions

These CSS interview questions are based on my personal interview experience. They are ordered from most to least likely to be asked.

1. Do you know about the box model? What are the different values of the box-sizing CSS property?

    From outside in: margin, border, padding, content. box-sizing can be content-box (the default, where width/height cover only the content) or border-box (where width/height include padding and border).

2. What are the CSS precedence rules and what is CSS specificity?

CSS specificity is the set of rules the browser uses to decide which style is applied to an element when several rules match.
The order of precedence, from highest to lowest, is as follows:



  • A CSS rule with !important always takes precedence.
  • An inline style (the element's style attribute) overrides all stylesheet rules (1,0,0,0).
  • A selector using an element id (the element's id attribute) (0,1,0,0).
  • A selector using a class, pseudo-class or attribute (0,0,1,0).
  • A selector using an element name, e.g. h1, p, div (0,0,0,1).
  • When two rules have the same specificity, the one that appears later in the code wins.

    3. Difference between visibility:hidden; and display:none;

    visibility:hidden - the element is not visible, but its space is still allocated in the layout.
    display:none - the element is not rendered and no space is allocated.

    4. How do you align a block element inside another element?

    5. Difference between Static, Relative, Absolute and Fixed position

    6. What is shadow dom

    7. Pseudo elements

    ::before
    ::after

    8. Pseudo classes

    :active
    :hover
    :nth-child(n)

    9. Media query in CSS3

    10. em vs rem vs px? Which unit of measurement should you use in CSS, and when?




    Top Javascript Interview Questions

    These Javascript interview questions are based on my personal interview experience. They are ordered from most to least likely to be asked.

    1. What is a closure and how do we use it?
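A minimal sketch of a closure (names here are illustrative): the inner function retains access to the outer function's variables even after the outer function has returned.

```javascript
// makeCounter returns a function that "closes over" the local
// variable `count`, keeping it alive and private between calls.
function makeCounter() {
  let count = 0;
  return function () {
    count += 1;
    return count;
  };
}

const next = makeCounter();
console.log(next()); // 1
console.log(next()); // 2 -- state survives between calls
console.log(makeCounter()()); // 1 -- each closure gets its own count
```

Closures are commonly used for data privacy (the module pattern), memoization, and callbacks that need to remember state.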

    2. What are promises and why do we use them?
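A promise represents a value that will be available later; consumers chain .then()/.catch() instead of nesting callbacks. A small sketch (the delay and function name are made up for illustration):

```javascript
// Wrap an asynchronous computation in a Promise that either
// resolves with a result or rejects with an error.
function delayedDouble(n) {
  return new Promise((resolve, reject) => {
    if (typeof n !== 'number') {
      reject(new TypeError('expected a number'));
      return;
    }
    setTimeout(() => resolve(n * 2), 10); // simulate async work
  });
}

delayedDouble(21)
  .then((result) => console.log('result:', result)) // result: 42
  .catch((err) => console.error('failed:', err.message));
```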

    3. How do you use async and await, and what problem do they solve?
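async/await is syntactic sugar over promises: asynchronous code reads top-to-bottom and uses ordinary try/catch instead of .then()/.catch() chains. A sketch with a stubbed async call (fetchUser is hypothetical):

```javascript
// A stand-in for a real network or database call.
function fetchUser(id) {
  return Promise.resolve({ id, name: 'user-' + id });
}

async function greet(id) {
  try {
    const user = await fetchUser(id); // suspends here without blocking the thread
    return 'Hello, ' + user.name;
  } catch (err) {
    return 'lookup failed: ' + err.message; // rejections become catchable errors
  }
}

greet(7).then(console.log); // Hello, user-7
```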

    4. What is the difference between call, apply and bind?
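All three let you choose the `this` a function runs with; the difference is how arguments are passed and when the call happens. A small sketch:

```javascript
const greeter = {
  prefix: 'Hi',
  greet(name, punct) {
    return this.prefix + ', ' + name + punct;
  },
};

const other = { prefix: 'Hello' };

// call: invokes immediately, arguments listed individually.
console.log(greeter.greet.call(other, 'Ada', '!'));    // Hello, Ada!
// apply: invokes immediately, arguments passed as an array.
console.log(greeter.greet.apply(other, ['Ada', '?'])); // Hello, Ada?
// bind: does NOT invoke; returns a new function with `this`
// (and optionally leading arguments) fixed for later.
const greetAda = greeter.greet.bind(other, 'Ada');
console.log(greetAda('.'));                            // Hello, Ada.
```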

    5. What are arrow functions and how are they different from normal functions?
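The key behavioural difference: an arrow function has no `this`, `arguments`, or `prototype` of its own (`this` is captured lexically from the enclosing scope), and it cannot be used with `new`. A sketch:

```javascript
const tally = {
  count: 0,
  increment() {
    // The arrow inherits `this` from increment(), so it sees `tally`.
    const bump = () => { this.count += 1; };
    bump();
    return this.count;
  },
};

console.log(tally.increment()); // 1

// Concise syntax with an implicit return:
const square = (x) => x * x;
console.log(square(4)); // 16

// Arrows are not constructors:
// new square(); // TypeError: square is not a constructor
```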

    6. What is the difference between "==" and "==="?
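`==` performs type coercion before comparing; `===` compares both type and value with no coercion, which is why it is usually preferred:

```javascript
console.log(1 == '1');           // true  -- '1' is coerced to the number 1
console.log(1 === '1');          // false -- number vs string
console.log(0 == false);         // true  -- false coerces to 0
console.log(0 === false);        // false
console.log(null == undefined);  // true  -- special case in the == rules
console.log(null === undefined); // false
console.log(NaN === NaN);        // false -- NaN is never equal to itself
```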

    7. What is the difference between the var, let and const keywords?
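In short: `var` is function-scoped and hoisted with an `undefined` value; `let` and `const` are block-scoped, and `const` additionally forbids reassignment. A sketch:

```javascript
function demo() {
  if (true) {
    var a = 1;   // function-scoped: visible in all of demo()
    let b = 2;   // block-scoped: visible only inside this if-block
    const c = 3; // block-scoped and cannot be reassigned
  }
  console.log(a);        // 1
  console.log(typeof b); // 'undefined' -- b does not exist out here
}
demo();

const fixed = 10;
// fixed = 11; // TypeError: Assignment to constant variable.
```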

    8. What is the difference between null and undefined?
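`undefined` means a value was never assigned (declared-but-unset variables, missing properties, functions with no return); `null` is an explicit "no value" set by the programmer:

```javascript
let x;                 // declared, never assigned
console.log(x);        // undefined
console.log(typeof x); // 'undefined'

const y = null;        // deliberately empty
console.log(typeof y); // 'object' -- a long-standing language quirk

console.log(x == y);   // true  -- == treats them as equal
console.log(x === y);  // false -- they are different types
```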

    9. Object-oriented programming in Javascript: what are constructors and prototypes? Do you know about prototypal inheritance? How do you define a class using ES6?
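The ES5 constructor-plus-prototype pattern and the ES6 `class` keyword build the same prototype chain; `class` is syntactic sugar. A sketch of both, plus prototypal inheritance via `extends`:

```javascript
// ES5 style: shared methods live on the constructor's prototype.
function AnimalFn(name) {
  this.name = name;
}
AnimalFn.prototype.speak = function () {
  return this.name + ' makes a sound';
};

// ES6 style: equivalent mechanism, nicer syntax.
class Animal {
  constructor(name) { this.name = name; }
  speak() { return this.name + ' makes a sound'; }
}
class Dog extends Animal {
  speak() { return this.name + ' barks'; } // found first on Dog.prototype
}

console.log(new AnimalFn('Rex').speak()); // Rex makes a sound
console.log(new Dog('Rex').speak());      // Rex barks
// The chain is real prototypal inheritance:
console.log(Object.getPrototypeOf(Dog.prototype) === Animal.prototype); // true
```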

    10. What are the methods available on Object?

    create - Creates a new object with the specified object as its prototype. Properties are inherited through the prototype chain, not copied.
    assign - Copies the own enumerable properties of one or more source objects into a target object. This is a shallow copy; the prototype is not copied.
    setPrototypeOf - Sets the prototype of an existing object.
    freeze - Freezes an object: new properties cannot be added and existing properties cannot be modified or removed.
    seal - Seals the schema of an object: new properties cannot be added, but existing properties can still be modified.
    defineProperty - Defines a property with a descriptor; for example, you can make a property read-only.

    11. What is currying in Javascript?
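Currying transforms f(a, b, c) into f(a)(b)(c), fixing arguments one at a time, which enables partial application. A hand-rolled example plus a generic helper:

```javascript
// Manually curried:
const add = (a) => (b) => (c) => a + b + c;
console.log(add(1)(2)(3)); // 6

// Partial application falls out for free:
const add10 = add(10);
console.log(add10(2)(3)); // 15

// A generic curry for functions of fixed arity:
function curry(fn) {
  return function curried(...args) {
    return args.length >= fn.length
      ? fn(...args)                             // enough args: call through
      : (...rest) => curried(...args, ...rest); // otherwise keep collecting
  };
}
const sum3 = curry((a, b, c) => a + b + c);
console.log(sum3(1)(2, 3)); // 6
console.log(sum3(1, 2)(3)); // 6
```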

    12. What are lexical scoping and variable hoisting in Javascript?
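Lexical scoping means a function resolves names in the scope where it was defined; hoisting means `var` (and function) declarations are processed before execution, but `var` initializations are not:

```javascript
// Hoisting: the declaration of `hoisted` is processed first,
// so this logs `undefined` rather than throwing.
console.log(hoisted); // undefined
var hoisted = 'assigned now';

// `let`/`const` are hoisted too but sit in the "temporal dead zone"
// until their declaration line, so accessing them early throws:
// console.log(tdz); // ReferenceError
let tdz = 1;

// Lexical scope: reader() resolves `outer` in the scope where it was DEFINED.
const outer = 'from the defining scope';
function reader() {
  return outer;
}
console.log(reader()); // from the defining scope
```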




    Thursday, 4 July 2019

    Ping Pong using wait notify

    package com.help4j.core.thread;
    
    public class PingPong {

        public static void main(String[] args) {
            Object lock = new Object();
            Thread ping = new Thread(new PingPongThread(lock, "Ping"));
            Thread pong = new Thread(new PingPongThread(lock, "Pong"));
            ping.start();
            pong.start();
        }
    }

    class PingPongThread implements Runnable {

        private final Object lock;
        private final String name;

        public PingPongThread(Object lock, String name) {
            this.lock = lock;
            this.name = name;
        }

        @Override
        public void run() {
            synchronized (lock) {
                while (true) {
                    System.out.println(name);

                    try {
                        Thread.sleep(1000); // slow the output down so it is readable
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }

                    // Wake the other thread, which is blocked in wait() below.
                    lock.notify();

                    try {
                        // Release the lock and wait for the other thread's notify().
                        // The 1-second timeout prevents a permanent hang if the first
                        // notify() fires before the other thread has started waiting.
                        lock.wait(1000);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            }
        }
    }

    Thursday, 13 June 2019

    Virtual DOM vs Shadow DOM

    DOM

    DOM stands for Document Object Model, an object-based representation of structured content such as HTML or XML. The browser maintains a DOM tree to render the page, and changes to the DOM trigger re-rendering (reflow and repaint), which can be expensive.

    Virtual DOM

    The Virtual DOM concept is adopted by popular UI libraries such as React and Vue, mainly to address this performance problem.
    The Virtual DOM is an in-memory representation of the DOM. Updates are first applied to the Virtual DOM instead of directly to the actual DOM. The library then compares the new Virtual DOM against the previous one through a process called "diffing" and applies the changes to the actual DOM efficiently, re-rendering only the elements that changed.
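A toy illustration of the diffing idea (not how React or Vue actually implement it): virtual nodes are plain objects, and the diff walks both trees, collecting only the paths that changed so only those subtrees need re-rendering.

```javascript
// Compare two virtual-DOM trees and collect the paths of changed nodes.
function diff(oldNode, newNode, path = 'root', changes = []) {
  const changed =
    !oldNode || !newNode ||
    oldNode.tag !== newNode.tag ||
    oldNode.text !== newNode.text;
  if (changed) {
    changes.push(path); // this subtree must be re-rendered
    return changes;
  }
  const kidsA = oldNode.children || [];
  const kidsB = newNode.children || [];
  for (let i = 0; i < Math.max(kidsA.length, kidsB.length); i++) {
    diff(kidsA[i], kidsB[i], path + '.' + i, changes);
  }
  return changes;
}

const before = { tag: 'div', children: [{ tag: 'p', text: 'hi' }, { tag: 'p', text: 'old' }] };
const after  = { tag: 'div', children: [{ tag: 'p', text: 'hi' }, { tag: 'p', text: 'new' }] };
console.log(diff(before, after)); // [ 'root.1' ] -- only the second child re-renders
```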

    Shadow DOM

    The Shadow DOM is supported natively by browsers (though not yet by all of them). Shadow DOM is mostly about encapsulation of implementation: you can build reusable native web components whose implementation and styling are hidden inside the shadow tree, unaffected by the outer DOM and not affecting it.

    Saturday, 18 May 2019

    Kafka Streams

    Stateless operators

    • branch
    • filter
    • filterNot (inverse filter)
    • flatMap
    • flatMapValues
    • foreach
    • groupByKey
    • groupBy
    • map
    • mapValues

    Stateful operators

    • join
    • aggregate
    • count
    • reduce
    • windowing

    Window

    1. Tumbling window
      • Time-based, fixed-size, non-overlapping, gap-less windows
      • E.g. if window-size=5min and advance-interval=5min, the windows look like [0-5min] [5min-10min] [10min-15min]...
    2. Hopping window
      • Time-based, fixed-size, overlapping windows
      • E.g. if window-size=5min and advance-interval=3min, the windows look like [0-5min] [3min-8min] [6min-11min]...
    3. Sliding window
      • Fixed-size, overlapping windows that work on the difference between record timestamps
      • Used only for join operations
    4. Session window
      • Session-based, dynamically sized, non-overlapping, data-driven windows
      • Used to aggregate key-based events into sessions
    For more information on windowing, refer to the Apache Kafka documentation.

    Sunday, 12 May 2019

    Confluent Schema Registry


    Avro

    Primitive Types

    1. null
    2. boolean
    3. int (32 bit)
    4. long (64 bit)
    5. float (32 bit)
    6. double (64 bit)
    7. bytes (sequence of 8-bit bytes)
    8. string (unicode character sequence)

    Complex Types

    1. record
    2. enum
    3. array
    4. map
    5. union
    6. fixed

    Avro Schema Definition

    • namespace (optional)
    • type (required) => record, enum, array, map, union, fixed
    • name (required)
    • doc (optional)
    • aliases (optional)
    • fields (required)
      • name (required)
      • type (required)
      • doc (optional)
      • default (optional)
      • order (optional)
      • aliases (optional)

    Confluent Schema Registry

    • Schema Registry stores all schemas in a Kafka topic defined by kafkastore.topic (default _schemas), which is a single-partition, log-compacted topic.
    • The response media types application/vnd.schemaregistry.v1+json, application/vnd.schemaregistry+json and application/json are used in response headers.
    • Both HTTP and HTTPS client protocols are supported by the schema registry.
    • The prefix applied to metric names for the default JMX reporter is kafka.schema.registry.
    • The default listener port is 8081.
    • Confluent supports the primitive types null, Boolean, Integer, Long, Float, Double, String and byte[], and the complex type IndexedRecord. Sending data of any other type to KafkaAvroSerializer causes a SerializationException.

    Schema Compatibility Types

    1. BACKWARD
      • A consumer using schema X can process data produced with schema X or X-1. With BACKWARD_TRANSITIVE, a consumer using schema X can process data produced with all previous schemas X, X-1, X-2 and so on.
      • Deleting a field without a default value (a required field) is allowed; the consumer simply ignores that field.
      • Adding a field with a default value (an optional field) is allowed; the consumer assigns the default value.
      • BACKWARD is the default compatibility type in the Confluent schema registry.
      • There is no assurance that consumers using the older schema can read data produced with the new schema, so upgrade all consumers before you start producing new events.
    2. FORWARD
      • Data produced with schema X can be read by consumers using schema X or X-1. With FORWARD_TRANSITIVE, data produced with schema X can be read by consumers using all previous schemas X, X-1, X-2 and so on.
      • Adding a field without a default value (a required field) is allowed; older consumers ignore that field.
      • Deleting a field with a default value (an optional field) is allowed; older consumers assign the default value.
      • There is no assurance that consumers using the new schema can read data produced with the older schema. Therefore, first upgrade all producers to the new schema, make sure data produced with the older schema is no longer available to consumers, and then upgrade the consumers.
    3. FULL
      • Backward and forward compatible between schemas X and X-1. With FULL_TRANSITIVE, backward and forward compatible across all previous schemas X, X-1, X-2 and so on.
      • Modifying a field with a default value (an optional field) is allowed.
      • Consumers using the older schema can read data produced with the new schema, and consumers using the new schema can read data produced with the older schema, so producers and consumers can be upgraded independently.
    4. NONE
      • This compatibility type means schema compatibility checks are disabled.
      • The upgrade order depends on the change. For example, when modifying a field type from Number to String, you will need to upgrade all producers and consumers to the new schema version at the same time.


    Saturday, 11 May 2019

    Kafka Consumer Using Java

    package com.abc.demo;
    
    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import java.util.concurrent.ExecutionException;
    
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;
    
    public class KafkaConsumerTest {
    
     public static void main(String[] args) throws InterruptedException, ExecutionException{
      //Create consumer property
      String bootstrapServer = "localhost:9092";
      String groupId = "my-first-consumer-group";
      String topicName = "my-first-topic";
      
      Properties properties = new Properties();
      properties.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServer);
      properties.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
      properties.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
      properties.setProperty(ConsumerConfig.GROUP_ID_CONFIG, groupId);
      properties.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
      properties.setProperty(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
      
      //Create consumer
      KafkaConsumer<String, String> consumer = new KafkaConsumer<>(properties);
      
      //Subscribe consumer to topic(s)
      consumer.subscribe(Collections.singleton(topicName));
      
      
      //Poll for new data
      while(true){
       ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
       
       for(ConsumerRecord<String, String> record: records){
        System.out.println("Key: " + record.key() + ", Value: " + record.value());
        System.out.println("Topic: " + record.topic() + ", Partition: " + record.partition() + ", Offset: " + record.offset());
       }
       
       //Commit consumer offset manually (recommended)
       consumer.commitAsync();
      }
      
     }
    }
