Wednesday, November 25, 2020

ServiceNow certification

I got my ServiceNow Admin certification more than two years ago, and since then I have taken a few delta exams: open-book exams covering the changes introduced by each new release. Until now those delta exams were free, and mandatory to maintain the certification, but with the current release ("Paris") ServiceNow introduced a $200 annual fee.

It was announced almost half a year ago, but I didn't actually pay attention to that small detail. Today I received an email reminding me of the deadline: the exam has to be taken before January 7th, 2021.

Sorry ServiceNow, but I think I'll just let this certification expire. My plans went in a different direction and I haven't used this certification at all, so paying $200/year just to maintain it is out of the question. I intend to maintain some other certifications I have, and I'd do it for ServiceNow too if the delta exams were still free, but not under these new conditions.

Wednesday, November 11, 2020

.NET 5.0

.NET 5.0 launched yesterday at dotnetconf.net, and I've just installed it:

  • Downloaded and installed .NET 5 SDK. Checked.
  • Updated Visual Studio to 16.8. Checked.
  • Read the release notes. Checked.
  • Read EF Core 5.0 release notes. Checked.
  • Watched dotnetconf.net. Checked.
No more release candidate versions, and no more .NET Core (except EF Core), but one unified platform. I'd like that, once all the parts are ready.

Monday, November 2, 2020

Salesforce platform developer 1 certification

As planned, today I took and happily passed my second Salesforce certification: Platform Developer I.


The next one on the Salesforce Application Architect credential path will have to wait a while, as preparing for it will take a bit longer, but hopefully it will come.

Wednesday, October 28, 2020

Salesforce certifications

I played with my own Salesforce org for a while, but lately a few things aligned: mainly, the company I work for has just moved to Salesforce. It was a pretty big move, delivered by a big name in consulting services, but now we (the app dev department) slowly have to take on more and more of the support work. So I decided to get certified in Salesforce.

Besides this, the company also moved to DocuSign (for e-signatures, of course), which integrates nicely with Salesforce. Again we had some consultants working on this, but it also sparked my interest.

DocuSign offers free developer instances, so I registered one and integrated it with Salesforce (my dev org). Then I registered with DocuSign University (a separate registration), read the docs, did some testing, and successfully tried a first certification: DocuSign eSignature Template Specialist 2020. Then came a more important one I was interested in, DocuSign eSignature for Salesforce Specialist 2020 (after reading the manual and doing the University exercises). Yay! It worked, and even better, it was all free.

Then the focus moved back to Salesforce certification, which I had been preparing for quite some time. The first one is App Builder, which I passed two days ago:


Yay again! The next one is logically Salesforce Platform Developer I, which I intend to try in a few days. Hopefully I'll have good news on that one too, very soon.

Keeping busy in these special conditions of working from home.

Monday, July 20, 2020

ASP.NET Core trick: breakpoint for 404 errors

I had a 404 error in a test web app behind an NGINX reverse proxy, and it was pretty hard to track down. Most of the routes worked fine, except the pages under an /admin folder, which all returned a 404 Not Found error.

The easiest way to track it down is a breakpoint somewhere, but where exactly, for a page that's not found? So I temporarily added a middleware in Startup.cs, by the book:

app.Use(async (context, next) =>
{
    // Set a breakpoint on the next line and inspect context.Request.
    await next.Invoke();
    // After the rest of the pipeline runs, context.Response.StatusCode holds the 404.
});

Then just set the breakpoint on the await next.Invoke() line and inspect the Request object.

In my case all I needed was a trailing slash in the proxy_pass parameter of NGINX's config file.
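As a sketch of why that slash matters (the server name, port, and location are hypothetical, not my actual config): when proxy_pass ends with a slash, NGINX replaces the matched location prefix before passing the request upstream; without it, the full original path is forwarded.

```nginx
# Hypothetical example. With the trailing slash below, a request for
# /admin/users is forwarded upstream as /users; without it, the upstream
# receives /admin/users unchanged, which can produce a 404 if the app
# doesn't serve that prefix.
location /admin/ {
    proxy_pass http://localhost:5000/;
}
```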

Monday, May 25, 2020

IPython Notebook to connect to SQL Server 2019

I installed Anaconda 3 some time ago, and then I installed SQL Server 2019 without the ML libraries for Python. Now I just wanted to connect directly from an IPython notebook to this local named instance of SQL Server.

I tried both pyodbc (which worked on the first try) and pymssql, which seemed to have an issue. The error message was:

"MSSQLDatabaseException: (20009, b'DB-Lib error message 20009, severity 9:\nUnable to connect: Adaptive Server is unavailable or does not exist (localhost\\SQLSRV2019)\n')"


It looks like the pymssql library needs the TCP/IP protocol to be enabled, and also needs the SQL Server Browser service to be running. So start SQL Server 2019 Configuration Manager and enable the TCP/IP protocol:



And in Windows' Services.msc, make sure SQL Server Browser is running:



After these changes, restart the SQL Server 2019 instance (don't forget this step).

The code for pyodbc:
import pyodbc

conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER=.\SQLSRV2019;DATABASE=db1;UID=sa;PWD=***')
cursor = conn.cursor()
cursor.execute("select id1, name1 from Table1")
rows = cursor.fetchall()
print(rows)
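The pyodbc snippet follows the standard Python DB-API flow (connect, cursor, execute, fetchall). Purely as an illustration of that flow, with an in-memory SQLite database standing in for the SQL Server instance and made-up sample rows:

```python
import sqlite3

# In-memory SQLite database standing in for the SQL Server instance,
# with hypothetical sample data, just to show the DB-API flow.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("create table Table1 (id1 integer, name1 text)")
cursor.executemany("insert into Table1 values (?, ?)", [(1, "alpha"), (2, "beta")])

cursor.execute("select id1, name1 from Table1")
rows = cursor.fetchall()
print(rows)  # [(1, 'alpha'), (2, 'beta')]
conn.close()
```

The same connect/cursor/fetchall calls work unchanged against pyodbc, which is what makes switching drivers relatively painless.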
The code for pymssql with Pandas DataFrame:
import pymssql
import pandas as pd

conn = pymssql.connect(user='sa', password='***', host='.\SQLSRV2019', database='db1')
q = pd.read_sql_query('select name1, id1 from Table1', conn)
conn.close()
df = pd.DataFrame(q, columns=['name1', 'id1'])
print(df)
Then let's plot the values with matplotlib:
import matplotlib.pyplot as plt

df.plot(kind='bar')
plt.show()  # not needed in a notebook, where the plot renders inline
You can try these with JupyterLab or Jupyter Notebook from Anaconda 3, or even with the VS Code extension for Anaconda.

Monday, February 10, 2020

Azure Key Vault—Private endpoints now available in preview

"Establish a private connection between Azure Key Vault and other Azure services by using Azure Private Link, now available in preview for all public regions.
[...]
All traffic to the service can be routed through the private endpoint, so no gateways, NAT devices, ExpressRoute or VPN connections, or public IP addresses are needed. Traffic between your virtual network and the service traverses over the Microsoft backbone network, eliminating exposure from the public Internet."

One question may be: why only now? This should have been there from the beginning.

Docs link.