Friday, December 29, 2023

Oracle Critical Patch Updates (CPUs)

 

Critical Patch Updates


Critical Patch Updates provide security patches for supported Oracle on-premises products. They are available to customers with valid support contracts. 

Starting in April 2022, Critical Patch Updates are released on the third Tuesday of January, April, July, and October (previously they were published on the Tuesday closest to the 17th of January, April, July, and October). 


The next four dates are:


  • 16 January 2024
  • 16 April 2024
  • 16 July 2024
  • 15 October 2024


A pre-release announcement will be published on the Thursday preceding each Critical Patch Update release.

Wednesday, December 27, 2023

How to push from local repo to remote ?

Push from local repo to remote



PS C:\git-lab\demo1> git remote -v

origin  https://github.com/anuragkumarjoy/terraform-code.git (fetch)

origin  https://github.com/anuragkumarjoy/terraform-code.git (push)

PS C:\git-lab\demo1>

PS C:\git-lab\demo1>




PS C:\git-lab\demo1> git push origin master

Enumerating objects: 5, done.

Counting objects: 100% (5/5), done.

Writing objects: 100% (3/3), 282 bytes | 282.00 KiB/s, done.

Total 3 (delta 0), reused 0 (delta 0), pack-reused 0

To https://github.com/anuragkumarjoy/terraform-code.git

   ae8129e..ea488b5  master -> master

PS C:\git-lab\demo1> 

PS C:\git-lab\demo1>




git push -f origin master also works (force push; use with care, as it overwrites the remote branch history)

How to add git remote origin ?

Add git remote origin


PS C:\git-lab\demo1> git remote -v

PS C:\git-lab\demo1> 

PS C:\git-lab\demo1>

PS C:\git-lab\demo1> git remote add origin https://github.com/anuragkumarjoy/terraform-code.git

PS C:\git-lab\demo1> 

PS C:\git-lab\demo1>

PS C:\git-lab\demo1> git remote -v

origin  https://github.com/anuragkumarjoy/terraform-code.git (fetch)

origin  https://github.com/anuragkumarjoy/terraform-code.git (push)

PS C:\git-lab\demo1>

PS C:\git-lab\demo1>

How to remove remote origin in git ?

 Remove git remote origin



PS C:\git-lab\demo1> git remote -v

origin  https://github.com/anuragkumarjoy/terraform-code.git (fetch)

origin  https://github.com/anuragkumarjoy/terraform-code.git (push)

PS C:\git-lab\demo1>

PS C:\git-lab\demo1>


PS C:\git-lab\demo1> git remote remove origin

PS C:\git-lab\demo1> 

PS C:\git-lab\demo1>

PS C:\git-lab\demo1> git remote -v

PS C:\git-lab\demo1> 

PS C:\git-lab\demo1>


Git push Error - fatal: credential-cache unavailable; no unix socket support

PS C:\git-lab\demo1>

PS C:\git-lab\demo1> git push --set-upstream origin master

fatal: credential-cache unavailable; no unix socket support

Everything up-to-date

branch 'master' set up to track 'origin/master'.

PS C:\git-lab\demo1>


PS C:\git-lab\demo1> 

PS C:\git-lab\demo1> git config --global --unset credential.helper

PS C:\git-lab\demo1> 



PS C:\git-lab\demo1> git push --set-upstream origin master

Everything up-to-date

branch 'master' set up to track 'origin/master'.

PS C:\git-lab\demo1>

PS C:\git-lab\demo1>

PS C:\git-lab\demo1>
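The error comes from credential.helper being set to "cache", which relies on a Unix domain socket that Git for Windows does not provide. Unsetting the helper (as above) clears the message; alternatively, on Windows the helper can point at Git Credential Manager. A hedged suggestion, assuming Git for Windows with GCM installed:

git config --global credential.helper manager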

Thursday, October 19, 2023

How to know Postgresql server uptime ?

 

Log in to the database as a privileged user 


Query :


SELECT pg_postmaster_start_time();



output :


demo1=> SELECT pg_postmaster_start_time();

   pg_postmaster_start_time

-------------------------------

 2023-10-19 08:46:45.432054+00

(1 row)



demo1=>
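To get the uptime as a duration rather than a start timestamp, the start time can be subtracted from now(); a small additional query:

SELECT now() - pg_postmaster_start_time() AS uptime;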

Friday, September 22, 2023

Test Endpoint failed: Application-Status: 1020912, Application-Message: Cannot connect to ODBC provider ODBC general error., Application-Detailed-Message: RetCode: SQL_ERROR SqlState: HY000 NativeError: 1130 Message: [MySQL][ODBC 8.0(w) Driver]Host '10.1.1.12' is not allowed to connect to this MySQL server




Solution :


Error 1130 means the connection reached MySQL but the server has no user entry that allows logins from the source host (10.1.1.12). Make sure the database port is allowed in the source security group (test it with telnet), and then grant the DMS user permission to connect from that host inside MySQL, as shown below.
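A hedged example of granting that access in MySQL; the user name, password, and privilege list are placeholders and should be adjusted to the DMS endpoint settings:

CREATE USER 'dms_user'@'10.1.1.12' IDENTIFIED BY '********';
GRANT SELECT, REPLICATION CLIENT, REPLICATION SLAVE ON *.* TO 'dms_user'@'10.1.1.12';
FLUSH PRIVILEGES;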


Thursday, September 21, 2023

Copy data from one table to another table using a procedure with Oracle hints

 

DECLARE
  i NUMBER := 0;

  CURSOR g1 IS SELECT /*+ parallel(8) */ * FROM AWAIS;
BEGIN
  FOR c1 IN g1 LOOP
    -- Note: the APPEND hint is ignored for single-row INSERT ... VALUES;
    -- it only triggers a direct-path load for INSERT ... SELECT.
    INSERT /*+ APPEND */ INTO AWAIS_STG
      (EMPLOYEE_ID, FIRST_NAME, LAST_NAME, EMAIL, PHONE_NUMBER, HIRE_DATE,
       JOB_ID, SALARY, COMMISSION_PCT, MANAGER_ID, DEPARTMENT_ID)
    VALUES
      (c1.EMPLOYEE_ID, c1.FIRST_NAME, c1.LAST_NAME, c1.EMAIL, c1.PHONE_NUMBER, c1.HIRE_DATE,
       c1.JOB_ID, c1.SALARY, c1.COMMISSION_PCT, c1.MANAGER_ID, c1.DEPARTMENT_ID);

    i := i + 1;
    IF i > 4096 THEN
      COMMIT;                                        -- commit every ~4096 rows
      DBMS_OUTPUT.PUT_LINE('Committed batch of ' || i || ' rows');
      i := 0;
    END IF;
  END LOOP;

  COMMIT;                                            -- commit the final partial batch
END;
/
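When intermediate commits are not required, a single set-based statement is usually faster, and it is the form in which the APPEND hint actually performs a direct-path load. A minimal sketch, assuming AWAIS_STG has the same column layout as AWAIS:

ALTER SESSION ENABLE PARALLEL DML;   -- needed for the insert itself to run in parallel

INSERT /*+ APPEND PARALLEL(AWAIS_STG, 8) */ INTO AWAIS_STG
SELECT /*+ PARALLEL(AWAIS, 8) */ * FROM AWAIS;

COMMIT;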

Wednesday, September 20, 2023

How to copy data from one table to another table using PL/SQL procedure - cursor commit interval ?

 

Copy data from one table to another table using PL/SQL procedure - cursor 


DECLARE
  i NUMBER := 0;

  CURSOR g1 IS SELECT * FROM AWAIS;
BEGIN
  FOR c1 IN g1 LOOP
    INSERT INTO AWAIS_STG
      (EMPLOYEE_ID, FIRST_NAME, LAST_NAME, EMAIL, PHONE_NUMBER, HIRE_DATE,
       JOB_ID, SALARY, COMMISSION_PCT, MANAGER_ID, DEPARTMENT_ID)
    VALUES
      (c1.EMPLOYEE_ID, c1.FIRST_NAME, c1.LAST_NAME, c1.EMAIL, c1.PHONE_NUMBER, c1.HIRE_DATE,
       c1.JOB_ID, c1.SALARY, c1.COMMISSION_PCT, c1.MANAGER_ID, c1.DEPARTMENT_ID);

    i := i + 1;
    IF i > 4096 THEN
      COMMIT;                                        -- commit every ~4096 rows (the commit interval)
      DBMS_OUTPUT.PUT_LINE('Committed batch of ' || i || ' rows');
      i := 0;
    END IF;
  END LOOP;

  COMMIT;                                            -- commit the final partial batch
END;
/
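For larger tables, the same commit-interval idea can be written with BULK COLLECT and FORALL, which reduces context switching between PL/SQL and SQL. A sketch under the same assumption that AWAIS_STG matches the column structure of AWAIS:

DECLARE
  CURSOR g1 IS SELECT * FROM AWAIS;

  TYPE t_rows IS TABLE OF AWAIS%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN g1;
  LOOP
    FETCH g1 BULK COLLECT INTO l_rows LIMIT 4096;    -- fetch one batch of up to 4096 rows
    EXIT WHEN l_rows.COUNT = 0;

    FORALL j IN 1 .. l_rows.COUNT
      INSERT INTO AWAIS_STG VALUES l_rows(j);        -- bulk-insert the whole batch

    COMMIT;                                          -- one commit per batch
  END LOOP;
  CLOSE g1;
END;
/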

Sunday, September 17, 2023

How to start Oracle database ?

 


YouTube Link : https://www.youtube.com/watch?v=hM-fHU92pV4&t=20s



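For reference, a minimal startup from the database server looks like this (it assumes the ORACLE_HOME and ORACLE_SID environment variables are already set for the instance):

sqlplus / as sysdba

SQL> STARTUP;
-- or in stages:
SQL> STARTUP NOMOUNT;
SQL> ALTER DATABASE MOUNT;
SQL> ALTER DATABASE OPEN;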

Friday, September 15, 2023

Privacy Policy

Privacy Policy

Last updated: September 15, 2023

This Privacy Policy describes Our policies and procedures on the collection, use and disclosure of Your information when You use the Service and tells You about Your privacy rights and how the law protects You.

We use Your Personal data to provide and improve the Service. By using the Service, You agree to the collection and use of information in accordance with this Privacy Policy. This Privacy Policy has been created with the help of the Free Privacy Policy Generator.

Interpretation and Definitions

Interpretation

The words of which the initial letter is capitalized have meanings defined under the following conditions. The following definitions shall have the same meaning regardless of whether they appear in singular or in plural.

Definitions

For the purposes of this Privacy Policy:

Account means a unique account created for You to access our Service or parts of our Service.

Affiliate means an entity that controls, is controlled by or is under common control with a party, where "control" means ownership of 50% or more of the shares, equity interest or other securities entitled to vote for election of directors or other managing authority.

Company (referred to as either "the Company", "We", "Us" or "Our" in this Agreement) refers to ORACLE DATABASE PROBLEM AND SOLUTIONS.

Cookies are small files that are placed on Your computer, mobile device or any other device by a website, containing the details of Your browsing history on that website among its many uses.

Country refers to: Haryana, India

Device means any device that can access the Service such as a computer, a cellphone or a digital tablet.

Personal Data is any information that relates to an identified or identifiable individual.

Service refers to the Website.

Service Provider means any natural or legal person who processes the data on behalf of the Company. It refers to third-party companies or individuals employed by the Company to facilitate the Service, to provide the Service on behalf of the Company, to perform services related to the Service or to assist the Company in analyzing how the Service is used.

Usage Data refers to data collected automatically, either generated by the use of the Service or from the Service infrastructure itself (for example, the duration of a page visit).

Website refers to ORACLE DATABASE PROBLEM AND SOLUTIONS, accessible from https://anuragkumarjoy.blogspot.com

You means the individual accessing or using the Service, or the company, or other legal entity on behalf of which such individual is accessing or using the Service, as applicable.

Collecting and Using Your Personal Data

Types of Data Collected

Personal Data

While using Our Service, We may ask You to provide Us with certain personally identifiable information that can be used to contact or identify You. Personally identifiable information may include, but is not limited to:

Email address

Usage Data

Usage Data

Usage Data is collected automatically when using the Service.

Usage Data may include information such as Your Device's Internet Protocol address (e.g. IP address), browser type, browser version, the pages of our Service that You visit, the time and date of Your visit, the time spent on those pages, unique device identifiers and other diagnostic data.

When You access the Service by or through a mobile device, We may collect certain information automatically, including, but not limited to, the type of mobile device You use, Your mobile device unique ID, the IP address of Your mobile device, Your mobile operating system, the type of mobile Internet browser You use, unique device identifiers and other diagnostic data.

We may also collect information that Your browser sends whenever You visit our Service or when You access the Service by or through a mobile device.

Tracking Technologies and Cookies

We use Cookies and similar tracking technologies to track the activity on Our Service and store certain information. Tracking technologies used are beacons, tags, and scripts to collect and track information and to improve and analyze Our Service. The technologies We use may include:

·       Cookies or Browser Cookies. A cookie is a small file placed on Your Device. You can instruct Your browser to refuse all Cookies or to indicate when a Cookie is being sent. However, if You do not accept Cookies, You may not be able to use some parts of our Service. Unless you have adjusted Your browser setting so that it will refuse Cookies, our Service may use Cookies.

·       Web Beacons. Certain sections of our Service and our emails may contain small electronic files known as web beacons (also referred to as clear gifs, pixel tags, and single-pixel gifs) that permit the Company, for example, to count users who have visited those pages or opened an email and for other related website statistics (for example, recording the popularity of a certain section and verifying system and server integrity).

Cookies can be "Persistent" or "Session" Cookies. Persistent Cookies remain on Your personal computer or mobile device when You go offline, while Session Cookies are deleted as soon as You close Your web browser. Learn more about cookies on the Free Privacy Policy website article.

We use both Session and Persistent Cookies for the purposes set out below:

Necessary / Essential Cookies

Type: Session Cookies

Administered by: Us

Purpose: These Cookies are essential to provide You with services available through the Website and to enable You to use some of its features. They help to authenticate users and prevent fraudulent use of user accounts. Without these Cookies, the services that You have asked for cannot be provided, and We only use these Cookies to provide You with those services.

Cookies Policy / Notice Acceptance Cookies

Type: Persistent Cookies

Administered by: Us

Purpose: These Cookies identify if users have accepted the use of cookies on the Website.

Functionality Cookies

Type: Persistent Cookies

Administered by: Us

Purpose: These Cookies allow us to remember choices You make when You use the Website, such as remembering your login details or language preference. The purpose of these Cookies is to provide You with a more personal experience and to avoid You having to re-enter your preferences every time You use the Website.

For more information about the cookies we use and your choices regarding cookies, please visit our Cookies Policy or the Cookies section of our Privacy Policy.

Use of Your Personal Data

The Company may use Personal Data for the following purposes:

To provide and maintain our Service, including to monitor the usage of our Service.

To manage Your Account: to manage Your registration as a user of the Service. The Personal Data You provide can give You access to different functionalities of the Service that are available to You as a registered user.

For the performance of a contract: the development, compliance and undertaking of the purchase contract for the products, items or services You have purchased or of any other contract with Us through the Service.

To contact You: To contact You by email, telephone calls, SMS, or other equivalent forms of electronic communication, such as a mobile application's push notifications regarding updates or informative communications related to the functionalities, products or contracted services, including the security updates, when necessary or reasonable for their implementation.

To provide You with news, special offers and general information about other goods, services and events which we offer that are similar to those that you have already purchased or enquired about unless You have opted not to receive such information.

To manage Your requests: To attend and manage Your requests to Us.

For business transfers: We may use Your information to evaluate or conduct a merger, divestiture, restructuring, reorganization, dissolution, or other sale or transfer of some or all of Our assets, whether as a going concern or as part of bankruptcy, liquidation, or similar proceeding, in which Personal Data held by Us about our Service users is among the assets transferred.

For other purposes: We may use Your information for other purposes, such as data analysis, identifying usage trends, determining the effectiveness of our promotional campaigns and to evaluate and improve our Service, products, services, marketing and your experience.

We may share Your personal information in the following situations:

·       With Service Providers: We may share Your personal information with Service Providers to monitor and analyze the use of our Service, to contact You.

·       For business transfers: We may share or transfer Your personal information in connection with, or during negotiations of, any merger, sale of Company assets, financing, or acquisition of all or a portion of Our business to another company.

·       With Affiliates: We may share Your information with Our affiliates, in which case we will require those affiliates to honor this Privacy Policy. Affiliates include Our parent company and any other subsidiaries, joint venture partners or other companies that We control or that are under common control with Us.

·       With business partners: We may share Your information with Our business partners to offer You certain products, services or promotions.

·       With other users: when You share personal information or otherwise interact in the public areas with other users, such information may be viewed by all users and may be publicly distributed outside.

·       With Your consent: We may disclose Your personal information for any other purpose with Your consent.

Retention of Your Personal Data

The Company will retain Your Personal Data only for as long as is necessary for the purposes set out in this Privacy Policy. We will retain and use Your Personal Data to the extent necessary to comply with our legal obligations (for example, if we are required to retain your data to comply with applicable laws), resolve disputes, and enforce our legal agreements and policies.

The Company will also retain Usage Data for internal analysis purposes. Usage Data is generally retained for a shorter period of time, except when this data is used to strengthen the security or to improve the functionality of Our Service, or We are legally obligated to retain this data for longer time periods.

Transfer of Your Personal Data

Your information, including Personal Data, is processed at the Company's operating offices and in any other places where the parties involved in the processing are located. It means that this information may be transferred to — and maintained on — computers located outside of Your state, province, country or other governmental jurisdiction where the data protection laws may differ than those from Your jurisdiction.

Your consent to this Privacy Policy followed by Your submission of such information represents Your agreement to that transfer.

The Company will take all steps reasonably necessary to ensure that Your data is treated securely and in accordance with this Privacy Policy and no transfer of Your Personal Data will take place to an organization or a country unless there are adequate controls in place including the security of Your data and other personal information.

Delete Your Personal Data

You have the right to delete or request that We assist in deleting the Personal Data that We have collected about You.

Our Service may give You the ability to delete certain information about You from within the Service.

You may update, amend, or delete Your information at any time by signing in to Your Account, if you have one, and visiting the account settings section that allows you to manage Your personal information. You may also contact Us to request access to, correct, or delete any personal information that You have provided to Us.

Please note, however, that We may need to retain certain information when we have a legal obligation or lawful basis to do so.

Disclosure of Your Personal Data

Business Transactions

If the Company is involved in a merger, acquisition or asset sale, Your Personal Data may be transferred. We will provide notice before Your Personal Data is transferred and becomes subject to a different Privacy Policy.

Law enforcement

Under certain circumstances, the Company may be required to disclose Your Personal Data if required to do so by law or in response to valid requests by public authorities (e.g. a court or a government agency).

Other legal requirements

The Company may disclose Your Personal Data in the good faith belief that such action is necessary to:

·       Comply with a legal obligation

·       Protect and defend the rights or property of the Company

·       Prevent or investigate possible wrongdoing in connection with the Service

·       Protect the personal safety of Users of the Service or the public

·       Protect against legal liability

Security of Your Personal Data

The security of Your Personal Data is important to Us, but remember that no method of transmission over the Internet, or method of electronic storage is 100% secure. While We strive to use commercially acceptable means to protect Your Personal Data, We cannot guarantee its absolute security.

Children's Privacy

Our Service does not address anyone under the age of 13. We do not knowingly collect personally identifiable information from anyone under the age of 13. If You are a parent or guardian and You are aware that Your child has provided Us with Personal Data, please contact Us. If We become aware that We have collected Personal Data from anyone under the age of 13 without verification of parental consent, We take steps to remove that information from Our servers.

If We need to rely on consent as a legal basis for processing Your information and Your country requires consent from a parent, We may require Your parent's consent before We collect and use that information.

Links to Other Websites

Our Service may contain links to other websites that are not operated by Us. If You click on a third party link, You will be directed to that third party's site. We strongly advise You to review the Privacy Policy of every site You visit.

We have no control over and assume no responsibility for the content, privacy policies or practices of any third party sites or services.

Changes to this Privacy Policy

We may update Our Privacy Policy from time to time. We will notify You of any changes by posting the new Privacy Policy on this page.

We will let You know via email and/or a prominent notice on Our Service, prior to the change becoming effective and update the "Last updated" date at the top of this Privacy Policy.

You are advised to review this Privacy Policy periodically for any changes. Changes to this Privacy Policy are effective when they are posted on this page.

Contact Us

If you have any questions about this Privacy Policy, You can contact us:

·       By email: anuragkumarjoy@gmail.com


Wednesday, September 6, 2023

Install required Oracle RPMs using yum commands on Linux

Install required Oracle RPMs using yum commands



Log in as the root user 


yum install binutils -y

yum install compat-libstdc++-33 -y

yum install compat-libstdc++-33.i686 -y

yum install gcc -y

yum install gcc-c++ -y

yum install glibc -y

yum install glibc.i686 -y

yum install glibc-devel -y

yum install glibc-devel.i686 -y

yum install ksh -y

yum install libgcc -y

yum install libgcc.i686 -y

yum install libstdc++ -y

yum install libstdc++.i686 -y

yum install libstdc++-devel -y

yum install libstdc++-devel.i686 -y

yum install libaio -y

yum install libaio.i686 -y

yum install libaio-devel -y

yum install libaio-devel.i686 -y

yum install libXext -y

yum install libXext.i686 -y

yum install libXtst -y

yum install libXtst.i686 -y

yum install libX11 -y

yum install libX11.i686 -y

yum install libXau -y

yum install libXau.i686 -y

yum install libxcb -y

yum install libxcb.i686 -y

yum install libXi -y

yum install libXi.i686 -y

yum install make -y

yum install sysstat -y

yum install unixODBC -y

yum install unixODBC-devel -y

yum install zlib-devel -y

yum install elfutils-libelf-devel -y


####optional rpms 

yum install  xorg-x11-server-Xorg xorg-x11-xauth xorg-x11-apps -y   


yum install xdpyinfo*


Postgres db Error : connection failed: FATAL: password authentication failed for user - AWS Aurora Postgres/RDS

Postgres Connection error : pgAdmin


connection failed: FATAL: password authentication failed for user "amd" connection to server at "dev-app-instance-1.gbhfthy.us-east-1.rds.amazonaws.com" (10.8767.11.10), port 5432 failed: FATAL: password authentication failed for user "amd"


Solution: 


create database abcd;

create user amd password 'amd';

grant connect on database abcd to rds_superuser;

grant  rds_superuser to amd;
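After creating the user and grants, the login can be verified from the client side; a quick check using psql (the endpoint and names follow the example above):

psql -h dev-app-instance-1.gbhfthy.us-east-1.rds.amazonaws.com -U amd -d abcd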




Tuesday, September 5, 2023

Postgres db connection Error - Status : Failure -Test failed: FATAL: database "db_admin" does not exist

 

Status : Failure -Test failed: FATAL: database "db_admin" does not exist


Workaround: 

This happens while connecting with Oracle SQL Developer to an Aurora PostgreSQL / PostgreSQL database when the database name differs from the user name: the PostgreSQL JDBC driver connects to a database named after the user unless told otherwise. 



username: db_admin

password: password

Host: dev-app.cluster-abcsdedfff1afs.ue-east-1.rds.amazonaws.com/india?

Port: 5432








Solution / workaround : append the database name and a question mark to the host field, i.e. host/dbname? (here dev-app.cluster-abcsdedfff1afs.ue-east-1.rds.amazonaws.com/india?), so the JDBC URL points at the intended database.




How to connect with PostgreSQL database using oracle SQLDeveloper ?

 

  • Check the current SQL Developer tool version 





  • Download the required PostgreSQL JDBC driver using the below link 


https://jdbc.postgresql.org/download/


  • Configure & add the driver in the SQL Developer tool (Tools > Preferences > Database > Third Party JDBC Drivers > Add Entry, then select the downloaded .jar) 




  • After setup, restart the SQL Developer tool; you will see PostgreSQL in the connection type options as below 






  • Use the Postgres DB login details and make the connection 





You did it

 !!!!! Congratulations !!!!!



Monday, September 4, 2023

Oracle Database Release and Technical Support Roadmap

 

Oracle Database Release and support Timelines 




Friday, September 1, 2023

How to Fix psql: error: could not connect to server: No such file or directory in PostgreSQL

 

Error : psql: error: could not connect to server: No such file or directory


-bash-4.2$ psql

psql: error: could not connect to server: No such file or directory

        Is the server running locally and accepting

        connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?

-bash-4.2$



Check whether the Postgres cluster is running or not using the below command 



-bash-4.2$ ps -ef | grep -i postgres

root      3318  3253  0 06:05 pts/0    00:00:00 sudo su - postgres

root      3319  3318  0 06:05 pts/0    00:00:00 su - postgres

postgres  3320  3319  0 06:05 pts/0    00:00:00 -bash

postgres  3494  3320  0 06:22 pts/0    00:00:00 ps -ef

postgres  3495  3320  0 06:22 pts/0    00:00:00 grep --color=auto -i postgres

-bash-4.2$

-bash-4.2$



If the PostgreSQL cluster is not running, then start it using the below command 



-bash-4.2$

-bash-4.2$ pg_ctl -D /postgres/data/ start

waiting for server to start....2023-09-01 06:23:16.881 UTC [3508] LOG:  starting PostgreSQL 12.16 on x86_64-pc-linux-gnu, compiled by gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44), 64-bit

2023-09-01 06:23:16.881 UTC [3508] LOG:  listening on IPv4 address "0.0.0.0", port 5432

2023-09-01 06:23:16.881 UTC [3508] LOG:  listening on IPv6 address "::", port 5432

2023-09-01 06:23:16.883 UTC [3508] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"

2023-09-01 06:23:16.887 UTC [3508] LOG:  listening on Unix socket "/tmp/.s.PGSQL.5432"

2023-09-01 06:23:16.917 UTC [3508] LOG:  redirecting log output to logging collector process

2023-09-01 06:23:16.917 UTC [3508] HINT:  Future log output will appear in directory "log".

 done

server started

-bash-4.2$

-bash-4.2$




Verify using the below commands 



-bash-4.2$

-bash-4.2$

-bash-4.2$ ps -ef | grep -i postgres

root      3318  3253  0 06:05 pts/0    00:00:00 sudo su - postgres

root      3319  3318  0 06:05 pts/0    00:00:00 su - postgres

postgres  3320  3319  0 06:05 pts/0    00:00:00 -bash

postgres  3508     1  0 06:23 ?        00:00:00 /usr/pgsql-12/bin/postgres -D /postgres/data

postgres  3509  3508  0 06:23 ?        00:00:00 postgres: logger

postgres  3511  3508  0 06:23 ?        00:00:00 postgres: checkpointer

postgres  3512  3508  0 06:23 ?        00:00:00 postgres: background writer

postgres  3513  3508  0 06:23 ?        00:00:00 postgres: walwriter

postgres  3514  3508  0 06:23 ?        00:00:00 postgres: autovacuum launcher

postgres  3516  3508  0 06:23 ?        00:00:00 postgres: stats collector

postgres  3517  3508  0 06:23 ?        00:00:00 postgres: logical replication launcher

postgres  3525  3320  0 06:23 pts/0    00:00:00 ps -ef

postgres  3526  3320  0 06:23 pts/0    00:00:00 grep --color=auto -i postgres

-bash-4.2$




-bash-4.2$

-bash-4.2$ psql

psql (12.16)

Type "help" for help.


postgres=# select version();

                                                 version

----------------------------------------------------------------------------------------------------------

 PostgreSQL 12.16 on x86_64-pc-linux-gnu, compiled by gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44), 64-bit

(1 row)


postgres=#

postgres=#


Saturday, August 26, 2023

How to Lock the user account in Linux ?

 


[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]# passwd -l postgres

Locking password for user postgres.

passwd: Success


Check the user account status :


[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]# passwd -S postgres

postgres LK 2022-10-04 -1 -1 -1 -1 (Password locked.)

[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]#
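To unlock the account again, the -u option of passwd can be used (shown for the same user):

passwd -u postgres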

How to check the lock status of any Linux user Account ?

 


If user Locked 


[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]# passwd -S postgres

postgres LK 2022-10-04 -1 -1 -1 -1 (Password locked.)

[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]#



If user Unlocked



[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]# passwd -S postgres

postgres PS 2022-10-04 -1 -1 -1 -1 (Password set, SHA512 crypt.)

[root@10.1.1.1 ~]#

[root@10.1.1.1 ~]#



========================

LK: Password locked

NP: No password

PS: Password set

Thursday, August 24, 2023

"ORA-00942 : table or view does not exist. " on synonym

"ORA-00942- table or view does not exist" error you are getting , this indicate that user1 does not have privileges on the user.test table (get the details from synonyms SQL which tables are included ) .


We need to grant the required privileges 


SQL> GRANT SELECT, INSERT, UPDATE, DELETE 

     ON USER.TEST 

     TO USER1;



Note : Create dynamic SQL to refresh the privileges if multiple tables and schemas are included, as sketched below.
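A minimal sketch of generating those grants, assuming the owning schema is USER and the grantee is USER1 (review the generated statements before running them):

SELECT 'GRANT SELECT, INSERT, UPDATE, DELETE ON ' || owner || '.' || table_name || ' TO USER1;' AS grant_sql
  FROM dba_tables
 WHERE owner = 'USER';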

Wednesday, August 23, 2023

How to change the global name of a database in AWS Oracle RDS ?

 

We have to log in as the admin (master) user 


Execute the below command 


EXEC rdsadmin.rdsadmin_util.rename_global_name(p_new_global_name => 'new_global_name');

Wednesday, August 16, 2023

What is Git ?

 


  • Git was created by Linus Torvalds in 2005
  • Git is a distributed version control system; it stores every version of the content and keeps a well-documented history
  • Git can track changes to content and provides mechanisms for sharing that content with others on the team
  • It is open-source, free software
  • Available for multiple platforms (Linux, Unix, Mac, Windows)
  • It has an active community and is used by more than 70% of developers

Wednesday, August 2, 2023

remote-exec Provisioner in terraform

 ##### remote-exec Provisioner demo


resource "aws_instance" "dev-instance" {


  ami           = "ami-05548f9cecf47b442"

  instance_type = "t2.micro"

  key_name = "pgsql_key"

  


  connection {

    type = "ssh"

    user = "ec2-user"

    private_key = file("./pgsql_key.pem")

    host = self.public_ip


  }


  provisioner "remote-exec" {


    inline = [ 

        "sudo yum install -y mysql*"

     ]

    

  }


}



Note: we can use an existing key pair or generate a new one 

Friday, July 28, 2023

Planning failed. Terraform encountered an error while generating this plan.

 


Error: invalid value for name (must only contain alphanumeric characters, hyphens, underscores, commas, periods, @ symbols, plus and equals signs)




Solution : check the value and the type declared in the associated variable block; the name being built from it must contain only the allowed characters (a sketch follows below).
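A minimal sketch of what that looks like; the variable and resource names here are illustrative, not taken from the original code:

# variables.tf
variable "sg_name" {
  type    = string                          # declaring the type catches wrong values at plan time
  default = "allow_ssh_oracle_mssql_port"   # no spaces or other disallowed characters
}

# main.tf
resource "aws_security_group" "allow_ssh" {
  name        = var.sg_name
  description = "example security group"
}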

Thursday, July 27, 2023

How to use variable in terraform ?

 

main.tf



provider "aws" {

    region = "us-east-1"

  

}









resource "aws_security_group" "allow_ssh" {

    name = "allow_ssh_oracle_mssql_port"

    description = "allow_ssh_oracle_mssql_port "


    ingress {

    description      = "ssh-port"

    from_port        = 22

    to_port          = 22

    protocol         = "tcp"

    cidr_blocks      = [var.my_ip]

  }



  ingress {

    description      = "Oracle-port "

    from_port        = 1521

    to_port          = 1521

    protocol         = "tcp"

    cidr_blocks      = [var.my_ip]

  }



  ingress {

    description      = "MSSQL-Port"

    from_port        = 1433

    to_port          = 1433

    protocol         = "tcp"

    cidr_blocks      = [var.my_ip]

  }


  egress {

    from_port        = 0

    to_port          = 0

    protocol         = "-1"

    cidr_blocks      = ["0.0.0.0/0"]

    #ipv6_cidr_blocks = [var.my_ip]

  }


  tags = {

    Name = "allow_imp_port_Oracle_MSSQL_ssh"

  }

}

  




variables.tf


variable "my_ip" {

  default = "34.207.6.67/32"

}






Terraform commands


terraform init

terraform plan

terraform apply





How to Print output from resources using terraform ?

test_output.tf 


provider "aws" {

    region = "us-east-1"

  

}




resource "aws_eip" "eip1" {

    domain = "vpc"

  

}



output "My_Public_ip" {

  

   value = aws_eip.eip1.public_ip

}






Terraform commands


terraform init

terraform plan

terraform apply





Tuesday, July 25, 2023

How to Create github repository using terraform ?

terraform {

  required_providers {

    github = {

      source  = "integrations/github"

      version = "~> 5.0"

    }

  }

}


# Configure the GitHub Provider

provider "github" {


    token = "Put-Github-Token"

}



resource "github_repository" "MyFirstRepo" {

  name = "terraform-module1"

  description = "This Module contain code for Start"

  visibility = "public"

}



Terraform commands


terraform init

terraform plan

terraform apply

How to Create / launch AWS EC2 instance using Terraform ?

MyFirst-EC2.tf



#### Define credentials to connect with the provider (avoid hardcoding keys in real code; prefer environment variables or a shared credentials file)


provider "aws" {

    region = "us-east-1"

    access_key = "Put-Your-Access_Key"

    secret_key = "Put-Your-Secret_Key"

  

}



#### Create Resource i.e. EC2 


resource "aws_instance" "MyEc2" {

 ami = "ami-05548f9cecf47b442"  

 instance_type = "t2.micro"


 

}





Terraform commands 


terraform init
terraform plan
terraform apply

Tuesday, July 18, 2023

AWS Oracle Classic RDS Features & Limitations


Edition and licensing => License included (SE2) or BYOL (SE2/EE)

Multitenant PDB in CDB (license & maintenance cost reduction)  => 1 PDB

No of Database => 1 DB per RDS instance

Applications require access to underlying OS =>  N/A

Applications require access to underlying DB Env =>  N/A

sysdba privilege access  => N/A

Huge Pages (better Performance) => N/A

Local Undo => N/A

Data Guard Max Availability => with Multi AZ (no access on secondary) 

RDS database Patching =>  AWS Managed

Create a Multi-AZ deployment => Yes

Change CPU cores and threads per core on DB instance class => Yes

Turn on storage autoscaling => Yes

Automatic minor version upgrade => Yes

Kerberos authentication. => Yes

Performance Insights => Yes

TDE Encryption => Yes

Billing can be stopped => Yes

Scaling AWS          => Yes

High availability =>  AWS Managed

Database backups =>  AWS  Managed

Database software install => AWS Managed

OS patching =>  AWS Managed

OS installation => AWS Managed

Storage Limit => 64TB

Monday, July 17, 2023

Migrate Oracle Database to Any Cloud (AWS, Azure, Google, IBM, etc.) Using Native Tool - Data Guard

Oracle Database Migration to Any Cloud (AWS, Azure, etc.)


Pre-communication (No-Downtime)

  • Ensure the plan & cutover window are agreed
  • Ensure all stakeholders are on the same page
  • Ensure everyone is copied on mails and meetings
  • Ensure all issues found during iteration/testing are documented per database & application
  • Discuss and agree who will be available during the cutover window (application owner, database lead, infra lead, etc.)


Pre-Migration Phase (No-Downtime) 

On Source Database 

  • Delete/remove all backup tables / unused objects from the database with the help of the application team
  • Back up usernames & passwords
  • Get read-only access to the database server
  • Ensure db_unique_name is the same as the db_name
  • Set and crosscheck all parameters related to the Data Guard configuration
  • Check that remote connectivity is working and the port is open at the firewall level
  • Check and validate the database for corruption
  • Prepare & test the shared mount point (NFS/EFS/S3FS)
  • Take the database backup, including archive logs and a standby control file, to create the standby database
  • Check and back up network files (tnsnames.ora, listener.ora)
  • Check if a wallet is configured
  • Check if any encryption (TDE) is enabled


Prepare Target Environment (No-Downtime) 

  • Install Oracle binary software & build the target database server
    • Install Grid software if required
    • Install the Oracle database binaries
  • Configure & enable SSH & SCP (with the help of the root account - Infra team)
  • Configure and validate all mount points (database/application)
  • Configure and validate the same shared mount points (NFS/EFS/S3FS) on the target server(s)
  • Restore the database backup and configure the standby database
  • In case of ASM, add the spfile to the Grid software using srvctl
  • Validate remote database connectivity
  • Set up Data Guard replication from the primary (source) to the standby (target) database


Migration Phase (Downtime starts)

  • Check and validate that the primary and standby databases are in sync with no gap
  • Stop the application associated with the database
  • Create a test user / test tablespace and a test table on the primary database
  • Do 4-5 archive log switches on the primary (source) database
  • Again check and validate that the primary and standby databases are in sync with no gap
  • Check active database sessions in the primary (source) database; if sessions are still active after stopping the application, kill them
  • Create a guaranteed flashback restore point - enough FRA space should be available for this
  • There are multiple ways to do the role transition (see the sketch after this list)
    • Using a manual switchover
    • Using DGMGRL (the broker must be configured)
    • Using a manual failover (no switching back in this case)
  • Validate the migrated database
  • Set up & validate the connection with the application
  • Configure & update on-premises DNS with the new (target) IP address and create load balancer rules (if used)
  • Hand over to the application team for validation
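A minimal sketch of the restore point and manual switchover commands referenced above; the standby database unique name is a placeholder, the syntax assumes Oracle 12c or later, and the exact steps should be validated against the Data Guard documentation for your release:

-- On the primary, before the role change (guaranteed restore point used for rollback):
CREATE RESTORE POINT before_migration GUARANTEE FLASHBACK DATABASE;

-- Manual switchover (12c+ single-command form), run on the primary:
ALTER DATABASE SWITCHOVER TO <standby_db_unique_name> VERIFY;   -- pre-checks only
ALTER DATABASE SWITCHOVER TO <standby_db_unique_name>;

-- Open the new primary (the old standby) if it is not opened automatically:
ALTER DATABASE OPEN;

-- Equivalent with the broker, from DGMGRL:
-- SWITCHOVER TO '<standby_db_unique_name>';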

Post-Migration Phase (No-Downtime) 

  • Take a golden database backup after the successful migration
  • Configure & schedule database jobs
  • Configure & set up database monitoring


Rollback Phase 


  • Stop the application on the target database
  • Roll back the database using the restore point (source database), as sketched below
  • Update the DNS with the old entry for the rollback (if required)
  • Start and validate the database
  • Connect to the database & start the application
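A hedged sketch of the restore-point rollback on the source database; it assumes the guaranteed restore point created during the migration phase was named BEFORE_MIGRATION and is run from SQL*Plus as SYSDBA:

SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
FLASHBACK DATABASE TO RESTORE POINT before_migration;
ALTER DATABASE OPEN RESETLOGS;

-- Once the rollback has been validated, the restore point can be dropped:
DROP RESTORE POINT before_migration;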