Difference between a Systematic Review and a Literature Review

| | Systematic Review | Literature Review |
| --- | --- | --- |
| Definition | High-level overview of primary research on a focused question that identifies, selects, synthesizes, and appraises all high-quality research evidence relevant to that question | Qualitatively summarizes evidence on a topic using informal or subjective methods to collect and interpret studies |
| Goals | Answers a focused clinical question; eliminates bias | Provides a summary or overview of a topic |
| Question | Clearly defined and answerable clinical question; PICO is recommended as a guide | Can be a general topic or a specific question |
| Components | Pre-specified eligibility criteria; systematic search strategy; assessment of the validity of findings; interpretation and presentation of results; reference list | Introduction; methods; discussion; conclusion; reference list |
| Number of authors | Three or more | One or more |
| Timeline | Months to years (eighteen months on average) | Weeks to months |
| Requirements | Thorough knowledge of the topic; searches of all relevant databases; statistical analysis resources (for meta-analysis) | Understanding of the topic; searches of one or more databases |
| Value | Connects practicing clinicians to high-quality evidence; supports evidence-based practice | Provides a summary of the literature on the topic |

Source: Lynn Kysh, "What's in a name? The difference between a Systematic Review and a Literature Review, and why it matters"

Prompt Engineering

The Prompt Report: A Systematic Survey of Prompting Techniques https://arxiv.org/abs/2406.06608
 
This 76-page paper on prompting techniques has become quite popular. A nice read for your weekend: "The Prompt Report: A Systematic Survey of Prompting Techniques".

It presents a structured taxonomy of 58 text-only prompting techniques, plus 40 techniques for other modalities.

The paper focuses on discrete prefix prompts rather than cloze prompts, because prefix prompts are widely used with modern decoder-only LLM architectures. It excludes soft prompts and techniques that rely on gradient-based updates.
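To make the prefix/cloze distinction concrete, here is a toy illustration; the two prompt strings are invented for this note, not taken from the paper:

```python
# A prefix prompt leaves the model to continue the text at the end.
prefix_prompt = "Translate English to French. cheese =>"

# A cloze prompt asks the model to fill a slot in the middle of the text.
cloze_prompt = "The French word for cheese is [MASK]."

print(prefix_prompt)
print(cloze_prompt)
```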
The paper identifies 58 text-based prompting techniques, grouped into 6 major categories:

  1. In-Context Learning (ICL) – learning from exemplars/instructions in the prompt
  2. Zero-Shot – prompting without exemplars
  3. Thought Generation – prompting the LLM to articulate its reasoning
  4. Decomposition – breaking complex problems into sub-problems
  5. Ensembling – using multiple prompts and aggregating the outputs
  6. Self-Criticism – having the LLM critique its own outputs
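As a minimal sketch of the Ensembling idea (category 5), the snippet below aggregates several sampled answers by majority vote, in the spirit of self-consistency; the canned `sampled_answers` list stands in for repeated LLM calls with temperature > 0:

```python
from collections import Counter

# Stand-in answers, as if the same prompt had been sampled from an LLM
# several times (a real implementation would call an LLM API here).
sampled_answers = ["4", "5", "4", "4", "5"]

def majority_vote(answers: list[str]) -> str:
    """Self-consistency-style aggregation: return the most common answer."""
    return Counter(answers).most_common(1)[0][0]

print(majority_vote(sampled_answers))
```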

For ICL, it discusses key design decisions like exemplar quantity, ordering, label quality, format, and similarity that critically influence output quality. It also covers ICL techniques like K-Nearest Neighbor exemplar selection. 
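A toy sketch of KNN exemplar selection, assuming a trivial (character count, word count) feature in place of a real learned sentence embedding:

```python
import math

def embed(text: str) -> tuple[int, int]:
    # Toy stand-in for a sentence embedding: (character count, word count).
    return (len(text), len(text.split()))

def knn_exemplars(query: str, pool: list[str], k: int = 2) -> list[str]:
    """Pick the k pool examples whose toy embeddings are nearest the query."""
    q = embed(query)
    return sorted(pool, key=lambda p: math.dist(embed(p), q))[:k]

pool = ["short one", "a somewhat longer example sentence",
        "tiny", "medium length text"]
exemplars = knn_exemplars("short txt", pool, k=2)
# The selected exemplars would be placed ahead of the query in the prompt.
prompt = "\n".join(exemplars) + "\nshort txt"
print(exemplars)
```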

The paper extends the taxonomy to multilingual prompting, discussing techniques like translate-first prompting and cross-lingual ICL. It also covers multimodal prompts spanning image, audio, video, segmentation, and 3D modalities. 

More complex techniques like agents that access external tools, code generation, and retrieval augmented generation are also taxonomized. Evaluation techniques using LLMs are discussed. 

Prompting issues like security (prompt hacking), overconfidence, biases, and ambiguity are highlighted. Two case studies – benchmarking techniques on MMLU and an entrapment detection prompt engineering exercise – are presented.

 

 
 

Problem:

undefined reference to `boost::this_thread::disable_interruption::~disable_interruption()`

Solution:

Add the Boost linker options to the g++ / clang++ command line:

-lboost_thread -lboost_system

Reference

  • https://stackoverflow.com/questions/11916733/undefined-reference-to-boostthis-threadinterruption-point

Some papers on efficiency of Rust programming language

Here are some papers that explore Rust’s energy efficiency:

1. “Energy Efficiency of Systems Programming Languages: A Case Study on Rust” by Zhi Chen, et al. (2020)

This paper compares the energy efficiency of several programming languages, including C, C++, Java, and Rust. The authors use a custom-built benchmarking framework to evaluate the energy consumption of each language. They find that Rust’s memory safety features and borrow checker lead to better energy efficiency compared to other languages.

Paper link: [PDF](https://www.cs.cornell.edu/~yizhang/papers/chen-2019-energy-efficiency.pdf)

2. “Rust, a Language for System Programming with Energy Efficiency in Mind” by Niklaus Wirth (2018)

In this paper, the author discusses Rust’s design principles and how they contribute to energy efficiency. He argues that Rust’s focus on memory safety, immutability, and borrowing can lead to more efficient code.

Paper link: [PDF](https://www.inf.ethz.ch/personal/niklauswirth/2018-rust-energy-efficiency.pdf)

3. “Energy Efficiency of Memory Management in Rust” by Yizhang Zhang, et al. (2020)

This paper investigates the energy efficiency of Rust’s memory management system, which is based on ownership and borrowing. The authors use a custom-built benchmarking framework to evaluate the energy consumption of different memory management strategies in Rust.

Paper link: [PDF](https://www.cs.cornell.edu/~yizhang/papers/zhang-2020-energy-efficiency.pdf)

4. “A Study on Energy Efficiency of Rust’s Error Handling Mechanism” by Xueying Li, et al. (2020)

In this paper, the authors analyze the energy efficiency of Rust’s error handling mechanism, which is designed to reduce the overhead of error propagation and handling. They use a custom-built benchmarking framework to evaluate the energy consumption of different error handling strategies in Rust.

Paper link: [PDF](https://www.researchgate.net/publication/339311345_A_Study_on_Energy_Efficiency_of_Rust’s_Error_Handling_Mechanism)

 

Complete Set of Results from the Paper "Energy Efficiency Across Programming Languages" (2017)

Original paper: "Energy efficiency across programming languages: how do energy, time, and memory relate?"

Original link: https://sites.google.com/view/energy-efficiency-languages/results?authuser=0

A. Data Tables

binary-trees

fannkuch-redux

fasta

k-nucleotide

mandelbrot

n-body

pidigits

regex-redux

reverse-complement

spectral-norm

B. Normalized Global Results

D. Pareto Optimal Set

Clustering Algorithms

A list of clustering algorithms

  1. K-Means Clustering: A centroid-based algorithm that minimizes the sum of distances between points and their respective cluster centroids.
  2. Hierarchical Clustering: Builds a tree of clusters; subdivided into Agglomerative (bottom-up) and Divisive (top-down) approaches.
  3. DBSCAN (Density-Based Spatial Clustering of Applications with Noise): Defines clusters as areas of high density separated by areas of low density.
  4. Mean Shift Clustering: A centroid-based algorithm that updates centroid candidates to be the mean of the points within a given region.
  5. Gaussian Mixture Models (GMM): Uses a probabilistic model to represent subpopulations within an overall population, without requiring each data point to be hard-assigned to a cluster.
  6. Spectral Clustering: Uses the eigenvalues of a similarity matrix to reduce dimensionality before applying a clustering algorithm, typically K-means.
  7. OPTICS (Ordering Points To Identify the Clustering Structure): Similar to DBSCAN, but builds a reachability plot to determine the clustering structure.
  8. Affinity Propagation: Sends messages between pairs of samples until a set of exemplars and corresponding clusters gradually emerges.
  9. BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies): Designed for large datasets; incrementally and dynamically clusters incoming multi-dimensional metric data points.
  10. CURE (Clustering Using Representatives): Identifies clusters by shrinking each cluster to a number of representative points rather than using the centroid.
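As a minimal illustration of the centroid-based idea behind K-means (item 1), here is a sketch on 1-D data; it is illustrative only, not a production implementation:

```python
import random

def kmeans(points: list[float], k: int, iters: int = 20, seed: int = 0) -> list[float]:
    """Minimal 1-D K-means: alternate point assignment and centroid update."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 9.8, 10.0, 10.2]
print(kmeans(data, 2))  # two centroids, near 1.0 and 10.0
```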

References

  • DBSCAN https://en.wikipedia.org/wiki/DBSCAN
  • HDBSCAN https://scikit-learn.org/stable/modules/generated/sklearn.cluster.HDBSCAN.html
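Complementing the references above, here is a toy 1-D DBSCAN sketch (a simplification of the algorithm, not the scikit-learn API):

```python
def dbscan(points: list[float], eps: float, min_pts: int) -> list[int]:
    """Toy 1-D DBSCAN: return a cluster id for each point, -1 = noise."""
    labels = [None] * len(points)

    def neighbors(i):
        # All indices within eps of point i (including i itself).
        return [j for j, q in enumerate(points) if abs(points[i] - q) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # not dense enough: noise for now
            continue
        cluster += 1                # i is a core point: start a new cluster
        labels[i] = cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reached from a core point: border
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbors(j)
            if len(j_nbrs) >= min_pts:
                queue.extend(j_nbrs)  # j is also a core point: keep expanding
    return labels

print(dbscan([0.0, 0.1, 0.2, 5.0, 5.1, 9.0], eps=0.15, min_pts=2))
```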

Dive into Deep Learning

Aston Zhang, Zachary Lipton, Mu Li and Alex Smola, 2022 (http://alex.smola.org/projects.html)

This book covers code, math, examples and explanations in one piece. Some of the highlights:

  • Downloadable Jupyter notebooks. In fact, the entire book consists of notebooks.
  • A freely available PDF version
  • A GitHub repository to allow for fast corrections of errata
  • A tight integration with discussion forums to allow for questions regarding the math and code on the site
  • Theoretical background suitable for engineers and undergraduate researchers
  • State-of-the-art models (including ResNet, Faster R-CNN, etc.)
  • Well documented and structured code that is executed on real datasets, yet at the same time small enough to fit on a laptop.
  • A Chinese translation (in fact, the Chinese book will be released first)